Apr 17 17:22:11.931856 ip-10-0-140-147 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:22:11.931870 ip-10-0-140-147 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:22:11.931880 ip-10-0-140-147 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:22:11.932198 ip-10-0-140-147 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:22:22.127011 ip-10-0-140-147 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:22:22.127031 ip-10-0-140-147 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e7cf7d463ff84a1b83b903c63a1c580b --
Apr 17 17:24:48.221516 ip-10-0-140-147 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:24:48.633302 ip-10-0-140-147 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:48.633302 ip-10-0-140-147 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:24:48.633302 ip-10-0-140-147 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:48.633302 ip-10-0-140-147 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:24:48.633302 ip-10-0-140-147 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:48.634930 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.634839 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:24:48.637847 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637832 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637848 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637853 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637857 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637860 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637863 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637866 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637868 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637871 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637874 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637877 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637879 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637882 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637885 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637888 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637890 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:48.637887 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637893 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637901 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637904 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637907 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637910 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637912 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637915 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637918 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 
17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637920 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637923 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637925 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637928 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637931 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637933 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637936 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637938 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637941 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637943 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637946 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637948 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:48.638284 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637950 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637953 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637955 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637958 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637960 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637962 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637965 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637968 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637970 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637973 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637975 2566 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637977 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637981 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637983 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637987 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637991 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637993 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637996 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.637998 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638001 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:48.638760 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638003 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638006 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638008 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638011 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638013 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638015 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638018 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638020 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638023 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638026 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638028 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638031 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638033 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:48.639286 
ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638036 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638038 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638041 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638043 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638045 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638048 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638051 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:48.639286 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638053 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638055 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638058 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638060 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638063 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638066 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638071 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638076 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638079 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638082 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638461 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638467 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638469 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638474 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638477 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638480 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638484 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638487 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:48.639774 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638490 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638493 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638496 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638499 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638502 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638505 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638508 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638511 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638513 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638516 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638518 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638521 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638524 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638526 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638529 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638532 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638534 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638537 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:48.640206 ip-10-0-140-147 
kubenswrapper[2566]: W0417 17:24:48.638540 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638543 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:48.640206 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638545 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638549 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638551 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638554 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638557 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638559 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638562 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638564 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638567 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638569 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638572 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638574 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638577 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638579 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638581 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638584 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638586 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638589 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638591 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638593 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:48.640783 ip-10-0-140-147 kubenswrapper[2566]: W0417 
17:24:48.638596 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638598 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638601 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638604 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638607 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638610 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638612 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638614 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638617 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638619 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638623 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638625 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638628 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638630 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638633 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638636 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638638 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638641 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638643 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638646 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:48.641294 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638648 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638650 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638653 2566 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638655 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638658 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638661 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638663 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638665 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638668 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638671 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638674 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638678 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638681 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638684 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638686 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638689 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638691 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.638694 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640114 2566 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640123 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640130 2566 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 17:24:48.641781 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640136 2566 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640141 2566 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640144 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640148 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640153 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 17:24:48.642310 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640156 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640160 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640164 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640167 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640170 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640173 2566 flags.go:64] FLAG: --cgroup-root="" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640176 2566 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640179 2566 flags.go:64] FLAG: --client-ca-file="" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640182 2566 flags.go:64] FLAG: --cloud-config="" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640185 2566 flags.go:64] FLAG: --cloud-provider="external" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640190 2566 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640195 2566 flags.go:64] FLAG: --cluster-domain="" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640198 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640201 2566 flags.go:64] FLAG: --config-dir="" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640204 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640207 2566 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640212 2566 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640215 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640218 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 17:24:48.642310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640222 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640225 2566 flags.go:64] FLAG: --contention-profiling="false" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640228 2566 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640231 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640234 2566 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640237 2566 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640242 2566 flags.go:64] FLAG: 
--cpu-manager-reconcile-period="10s" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640244 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640247 2566 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640260 2566 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640263 2566 flags.go:64] FLAG: --enable-server="true" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640267 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640271 2566 flags.go:64] FLAG: --event-burst="100" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640275 2566 flags.go:64] FLAG: --event-qps="50" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640278 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640281 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640285 2566 flags.go:64] FLAG: --eviction-hard="" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640289 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640291 2566 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640294 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640298 2566 flags.go:64] FLAG: --eviction-soft="" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640301 2566 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640303 2566 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640307 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640310 2566 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 17:24:48.642891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640313 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640316 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640319 2566 flags.go:64] FLAG: --feature-gates="" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640323 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640326 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640330 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640333 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640336 2566 
flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640339 2566 flags.go:64] FLAG: --help="false" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640342 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-140-147.ec2.internal" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640345 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640348 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640352 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640355 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640359 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640362 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640365 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640368 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640370 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640373 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640376 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640379 2566 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640383 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640386 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:24:48.643526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640389 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640392 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640395 2566 flags.go:64] FLAG: --lock-file="" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640398 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640401 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640404 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640410 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640413 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640416 2566 flags.go:64] 
FLAG: --log-text-split-stream="false" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640420 2566 flags.go:64] FLAG: --logging-format="text" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640423 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640426 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640429 2566 flags.go:64] FLAG: --manifest-url="" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640432 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640437 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640440 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640444 2566 flags.go:64] FLAG: --max-pods="110" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640447 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640450 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640452 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640456 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640458 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640461 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640464 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640475 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:24:48.644101 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640478 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640480 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640483 2566 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640486 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640492 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640495 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640498 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640501 2566 flags.go:64] FLAG: --port="10250" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640504 2566 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640507 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00aeeb942ea227471" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640510 2566 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640513 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640516 2566 flags.go:64] FLAG: --register-node="true" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640520 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640523 2566 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640527 2566 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640530 2566 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640533 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640535 2566 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640539 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640543 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640547 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640550 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640553 2566 flags.go:64] FLAG: --runonce="false" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640556 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:24:48.644718 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640559 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640562 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640565 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640568 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640571 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640574 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640577 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640580 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640583 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640585 2566 flags.go:64] FLAG: 
--storage-driver-user="root" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640588 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640591 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640594 2566 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640597 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640602 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640605 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640608 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640612 2566 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640615 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640618 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640622 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640625 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640628 2566 flags.go:64] FLAG: --v="2" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640632 2566 flags.go:64] FLAG: --version="false" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640636 2566 flags.go:64] FLAG: --vmodule="" Apr 17 17:24:48.645337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640641 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.640644 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640734 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640738 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640741 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640743 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640746 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640749 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640751 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640754 2566 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640757 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640760 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640781 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640785 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640788 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640792 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640795 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640797 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640800 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640803 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:48.645923 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640805 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640808 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640811 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640814 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640816 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640819 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640821 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640829 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640832 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640834 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640837 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640840 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:48.646418 ip-10-0-140-147 
kubenswrapper[2566]: W0417 17:24:48.640843 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640845 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640848 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640851 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640853 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640856 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640858 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640861 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:48.646418 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640863 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640866 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640869 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640871 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640874 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640876 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640878 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640881 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640883 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640886 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640888 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640891 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640893 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640896 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 
17:24:48.640899 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640901 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640904 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640907 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640909 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640913 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:48.646980 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640916 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640918 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640921 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640925 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640930 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640933 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640936 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640938 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640941 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640944 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640947 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640949 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640952 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640954 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640957 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640959 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 
17:24:48.640962 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640965 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640967 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:48.647495 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640970 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640972 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640976 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640979 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640982 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640985 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640988 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640991 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.640993 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:48.648204 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.641710 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:48.649232 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.649208 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 17:24:48.649232 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.649228 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649291 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649296 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649299 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649302 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649305 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:48.649379 
ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649308 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649310 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649313 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649316 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649319 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649321 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649324 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649327 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649330 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649333 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649335 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649338 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649340 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649343 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:48.649379 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649345 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649348 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649351 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649353 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649356 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649359 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649361 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649364 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649366 2566 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649369 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649372 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649374 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649377 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649380 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649383 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649386 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649388 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649391 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649394 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:48.649884 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649396 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649399 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649401 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649404 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649406 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649409 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649411 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649414 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649417 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649420 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649422 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649425 2566 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649427 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649430 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649432 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649435 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649437 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649440 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649442 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649447 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:48.650376 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649450 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649453 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649455 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649459 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
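The kubelet emits one "unrecognized feature gate" warning per gate (feature_gate.go:328) each time the OpenShift feature set is re-applied, so the same names recur several times during this boot. As an illustrative post-processing sketch only, not anything the kubelet itself provides (the file name kubelet.log is a placeholder for wherever this journal excerpt has been saved), the following collapses the flood into distinct gate names with occurrence counts:

#!/usr/bin/env python3
"""Summarize 'unrecognized feature gate' warnings from a saved kubelet journal excerpt."""
import re
import sys
from collections import Counter

# Matches the message emitted at feature_gate.go:328, e.g.
#   ... feature_gate.go:328] unrecognized feature gate: GatewayAPIController
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def summarize(path):
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            # findall handles journal lines that carry several entries each
            for name in GATE_RE.findall(line):
                counts[name] += 1
    return counts

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "kubelet.log"  # placeholder file name
    for name, n in sorted(summarize(path).items()):
        print(f"{n:4d}  {name}")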
Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649463 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649466 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649469 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649473 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649475 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649478 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649481 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649484 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649486 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649489 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649491 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649494 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649497 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649499 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649503 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:48.650864 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649505 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649508 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649510 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649514 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649517 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649519 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649522 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:48.651347 ip-10-0-140-147 
kubenswrapper[2566]: W0417 17:24:48.649524 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649526 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.649531 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649641 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649646 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649649 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649652 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649654 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:48.651347 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649657 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649660 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649663 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649665 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649668 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649671 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649673 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649677 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649680 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649683 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649686 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649688 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649691 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649693 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649696 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649698 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649701 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649704 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649707 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:48.651723 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649709 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649712 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649715 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649717 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649720 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649722 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649726 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
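Each round of warnings ends with an info-level line from feature_gate.go:384 listing the gates the kubelet does recognize, printed in Go map syntax (feature gates: {map[DynamicResourceAllocation:false ... VolumeAttributesClass:false]}); two such summaries appear above and a third follows below. A small sketch, again only a suggested way to post-process this log output, that turns one of those lines into a Python dict of gate name to boolean:

import re

# Example taken (truncated) from the feature_gate.go:384 summary lines above.
LINE = ("feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false "
        "ImageVolume:true KMSv1:true ServiceAccountTokenNodeBinding:true]}")

def parse_feature_gates(line):
    """Extract Name:true/false pairs from a Go map printed as map[K:v K:v ...]."""
    body = re.search(r"map\[(.*?)\]", line)
    if not body:
        return {}
    gates = {}
    for pair in body.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")
    return gates

print(parse_feature_gates(LINE))
# {'DynamicResourceAllocation': False, 'EventedPLEG': False, 'ImageVolume': True, ...}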
Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649729 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649732 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649735 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649738 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649740 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649743 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649746 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649748 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649751 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649753 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649756 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649759 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649761 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:48.652192 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649764 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649766 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649769 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649771 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649774 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649777 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649779 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649782 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649784 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 
17:24:48.649787 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649789 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649792 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649795 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649797 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649800 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649802 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649804 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649807 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649810 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649812 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:48.652688 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649814 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649817 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649820 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649822 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649824 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649827 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649830 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649832 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649835 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649837 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649840 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649843 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:48.653159 
ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649845 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649848 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649850 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649852 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649855 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649858 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649860 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649863 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:48.653159 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649865 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:48.653653 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:48.649868 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:48.653653 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.649873 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:48.653653 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.650578 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 17:24:48.653653 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.653559 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 17:24:48.654412 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.654401 2566 server.go:1019] "Starting client certificate rotation" Apr 17 17:24:48.654512 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.654497 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:24:48.654545 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.654540 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:24:48.675890 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.675877 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:24:48.677713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.677694 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:24:48.690149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.690129 2566 log.go:25] "Validated 
CRI v1 runtime API" Apr 17 17:24:48.695284 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.695270 2566 log.go:25] "Validated CRI v1 image API" Apr 17 17:24:48.701658 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.701628 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 17:24:48.702903 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.702886 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:48.703738 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.703720 2566 fs.go:135] Filesystem UUIDs: map[148037f6-5b0a-48fc-8aec-888751b1c928:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fdb7de3c-6cde-4626-9a63-9f36567520d2:/dev/nvme0n1p3] Apr 17 17:24:48.703803 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.703738 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 17:24:48.710523 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.710293 2566 manager.go:217] Machine: {Timestamp:2026-04-17 17:24:48.708354733 +0000 UTC m=+0.378434367 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099332 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2844da5135490e523e9f7eddf582c9 SystemUUID:ec2844da-5135-490e-523e-9f7eddf582c9 BootID:e7cf7d46-3ff8-4a1b-83b9-03c63a1c580b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d3:bb:ed:c6:3f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d3:bb:ed:c6:3f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:01:af:5e:d7:d8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 17:24:48.711029 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711017 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 17:24:48.711113 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711101 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 17:24:48.711421 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711402 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 17:24:48.711567 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711423 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-147.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:24:48.711618 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711575 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:24:48.711618 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711585 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:24:48.711618 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.711598 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:48.712344 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.712333 2566 server.go:72] "Creating device plugin 
registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:48.713617 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.713607 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:48.713717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.713708 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:24:48.715764 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.715755 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:24:48.715799 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.715768 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:24:48.715799 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.715779 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:24:48.715799 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.715788 2566 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:24:48.715799 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.715797 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 17:24:48.716784 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.716773 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:48.716823 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.716791 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:48.719470 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.719456 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:24:48.720745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.720732 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:24:48.722380 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722369 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:24:48.722425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722386 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:24:48.722425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722393 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:24:48.722425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722401 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:24:48.722425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722409 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:24:48.722425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722415 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:24:48.722425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722422 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:24:48.722581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722433 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:24:48.722581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722440 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:24:48.722581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722446 2566 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/configmap" Apr 17 17:24:48.722581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722455 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:24:48.722581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.722464 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:24:48.723508 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.723497 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:24:48.723543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.723510 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:24:48.727077 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.727064 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:24:48.727124 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.727105 2566 server.go:1295] "Started kubelet" Apr 17 17:24:48.727213 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.727178 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:24:48.727369 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.727286 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:24:48.727428 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.727412 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:24:48.727926 ip-10-0-140-147 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:24:48.728081 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.728061 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:24:48.728146 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.728089 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:24:48.728190 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.728163 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-147.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:24:48.729061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.729038 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:24:48.729220 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.729208 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:24:48.732619 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.731620 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-147.ec2.internal.18a734d9f3ff0f94 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-147.ec2.internal,UID:ip-10-0-140-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-147.ec2.internal,},FirstTimestamp:2026-04-17 17:24:48.727076756 +0000 UTC m=+0.397156388,LastTimestamp:2026-04-17 17:24:48.727076756 +0000 UTC m=+0.397156388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-147.ec2.internal,}" Apr 17 17:24:48.734135 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.734118 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:48.734702 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.734684 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:24:48.734956 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.734931 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wfqc6" Apr 17 17:24:48.735305 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.735288 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:24:48.735305 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.735291 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:24:48.735436 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.735313 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:24:48.735436 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.735415 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:24:48.735436 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.735426 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:24:48.735698 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.735662 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:48.737279 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737244 2566 factory.go:153] Registering CRI-O factory Apr 17 17:24:48.737382 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737283 2566 factory.go:223] Registration of the crio container factory successfully Apr 17 17:24:48.737382 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737354 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:24:48.737382 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737364 2566 factory.go:55] Registering systemd factory Apr 17 17:24:48.737382 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737374 2566 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:24:48.737559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737397 2566 factory.go:103] Registering Raw factory Apr 17 17:24:48.737559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.737422 2566 manager.go:1196] Started watching for new ooms in manager Apr 17 17:24:48.738296 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.738279 2566 manager.go:319] Starting recovery of all containers Apr 17 17:24:48.740088 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.740064 2566 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:24:48.743359 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.743337 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wfqc6" Apr 17 17:24:48.744196 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.744163 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:24:48.745980 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.745951 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:24:48.746061 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.745992 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:24:48.749626 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.749598 2566 manager.go:324] Recovery completed Apr 17 17:24:48.754106 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.754093 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:48.757196 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.757180 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:48.757268 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.757214 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:48.757268 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.757225 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:48.757786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.757770 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:24:48.757853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.757787 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:24:48.757853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.757805 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:48.759987 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.759975 2566 policy_none.go:49] "None policy: Start" Apr 17 17:24:48.760041 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.759991 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:24:48.760041 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.760001 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:24:48.802881 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.802861 2566 manager.go:341] "Starting Device Plugin manager" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.802949 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.802964 2566 
server.go:85] "Starting device plugin registration server" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.803176 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.803185 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.803296 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.803372 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.803382 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.804306 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:24:48.805914 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.804341 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:48.833291 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.833266 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:24:48.833405 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.833297 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:24:48.833405 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.833318 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
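The entries above show the kubelet bootstrapping its API-server client certificate: csr-wfqc6 is approved and then issued, while requests made before the issued certificate is loaded still fail as system:anonymous (the forbidden CSIDriver list and lease errors). A minimal sketch of how this bootstrap could be verified, assuming oc access to the cluster and a shell on the node; the CSR name comes from the log above, and /var/lib/kubelet/pki/kubelet-client-current.pem is the kubelet's default client-certificate location rather than a path shown in this log:

    # Confirm the client CSR from the log reached Approved,Issued
    $ oc get csr csr-wfqc6
    # If a node's client CSR were stuck Pending, it could be approved manually
    $ oc adm certificate approve csr-wfqc6
    # On the node, check the validity window of the issued client certificate
    # (assumes the kubelet's default cert directory /var/lib/kubelet/pki)
    $ openssl x509 -noout -dates -in /var/lib/kubelet/pki/kubelet-client-current.pem

Once the issued certificate is in use, the anonymous RBAC errors stop and the node registers, as the subsequent entries show.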
Apr 17 17:24:48.833405 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.833325 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:24:48.833405 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.833355 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:24:48.836513 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.836494 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:48.903527 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.903428 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:48.904596 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.904577 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:48.904706 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.904607 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:48.904706 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.904618 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:48.904706 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.904643 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-147.ec2.internal" Apr 17 17:24:48.910629 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.910614 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-147.ec2.internal" Apr 17 17:24:48.910694 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.910635 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-147.ec2.internal\": node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:48.922386 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.922364 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:48.933968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.933942 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal"] Apr 17 17:24:48.934026 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.934010 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:48.934772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.934757 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:48.934843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.934784 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:48.934843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.934794 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:48.935990 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.935978 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:48.936138 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:24:48.936122 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:48.936202 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936165 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:48.936643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936629 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:48.936727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936629 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:48.936727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936689 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:48.936727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936707 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:48.936727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936655 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:48.936898 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.936734 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:48.937808 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.937792 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" Apr 17 17:24:48.937889 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.937852 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:48.938471 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.938455 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:48.938558 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.938484 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:48.938558 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:48.938499 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:48.953782 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.953757 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-147.ec2.internal\" not found" node="ip-10-0-140-147.ec2.internal" Apr 17 17:24:48.958591 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:48.958573 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-147.ec2.internal\" not found" node="ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.022609 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.022574 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.036120 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.036084 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc5397f3b7e87f74f15b6f9197df7f72-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal\" (UID: \"fc5397f3b7e87f74f15b6f9197df7f72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.036281 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.036132 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc5397f3b7e87f74f15b6f9197df7f72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal\" (UID: \"fc5397f3b7e87f74f15b6f9197df7f72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.123512 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.123466 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.136826 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.136743 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc5397f3b7e87f74f15b6f9197df7f72-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal\" (UID: \"fc5397f3b7e87f74f15b6f9197df7f72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.136910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.136824 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/fc5397f3b7e87f74f15b6f9197df7f72-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal\" (UID: \"fc5397f3b7e87f74f15b6f9197df7f72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.136910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.136865 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc5397f3b7e87f74f15b6f9197df7f72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal\" (UID: \"fc5397f3b7e87f74f15b6f9197df7f72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.136910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.136903 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cf7a4b638e7a30f012eeb3bf5b7f1ece-config\") pod \"kube-apiserver-proxy-ip-10-0-140-147.ec2.internal\" (UID: \"cf7a4b638e7a30f012eeb3bf5b7f1ece\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.137021 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.136943 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc5397f3b7e87f74f15b6f9197df7f72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal\" (UID: \"fc5397f3b7e87f74f15b6f9197df7f72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.224215 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.224148 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.237510 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.237489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cf7a4b638e7a30f012eeb3bf5b7f1ece-config\") pod \"kube-apiserver-proxy-ip-10-0-140-147.ec2.internal\" (UID: \"cf7a4b638e7a30f012eeb3bf5b7f1ece\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.237566 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.237536 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cf7a4b638e7a30f012eeb3bf5b7f1ece-config\") pod \"kube-apiserver-proxy-ip-10-0-140-147.ec2.internal\" (UID: \"cf7a4b638e7a30f012eeb3bf5b7f1ece\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.255602 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.255583 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.261327 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.261310 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" Apr 17 17:24:49.324989 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.324954 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.425489 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.425445 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.525975 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.525900 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.562517 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.562492 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:49.605377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.605351 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:49.626415 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.626389 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.653954 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.653934 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:24:49.654555 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.654068 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:24:49.654555 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.654098 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:24:49.654555 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.654098 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:24:49.726517 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.726484 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.734534 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.734515 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:49.740940 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:49.740907 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7a4b638e7a30f012eeb3bf5b7f1ece.slice/crio-608b3cc9a11525ccefdcf1047a3a06b99126afaa68c1ed313012302a0577bc6c WatchSource:0}: Error finding container 608b3cc9a11525ccefdcf1047a3a06b99126afaa68c1ed313012302a0577bc6c: Status 404 returned error can't find the container with id 
608b3cc9a11525ccefdcf1047a3a06b99126afaa68c1ed313012302a0577bc6c Apr 17 17:24:49.741545 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:49.741527 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc5397f3b7e87f74f15b6f9197df7f72.slice/crio-6066b06541607e95486bf6b2b196fb97a6d90c85951e7ca3b905f4faad76a5a6 WatchSource:0}: Error finding container 6066b06541607e95486bf6b2b196fb97a6d90c85951e7ca3b905f4faad76a5a6: Status 404 returned error can't find the container with id 6066b06541607e95486bf6b2b196fb97a6d90c85951e7ca3b905f4faad76a5a6 Apr 17 17:24:49.745919 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.745904 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:24:49.745996 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.745977 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:19:48 +0000 UTC" deadline="2027-11-14 12:23:57.418868039 +0000 UTC" Apr 17 17:24:49.746034 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.745997 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13818h59m7.67287359s" Apr 17 17:24:49.749515 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.749497 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:49.770052 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.770032 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vwvkm" Apr 17 17:24:49.777876 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.777838 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vwvkm" Apr 17 17:24:49.827053 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.827034 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:49.835731 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.835684 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" event={"ID":"fc5397f3b7e87f74f15b6f9197df7f72","Type":"ContainerStarted","Data":"6066b06541607e95486bf6b2b196fb97a6d90c85951e7ca3b905f4faad76a5a6"} Apr 17 17:24:49.836438 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:49.836418 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" event={"ID":"cf7a4b638e7a30f012eeb3bf5b7f1ece","Type":"ContainerStarted","Data":"608b3cc9a11525ccefdcf1047a3a06b99126afaa68c1ed313012302a0577bc6c"} Apr 17 17:24:49.927780 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:49.927756 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:50.028307 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.028216 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-147.ec2.internal\" not found" Apr 17 17:24:50.071843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.071817 2566 reflector.go:430] "Caches populated" type="*v1.Node" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:50.135666 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.135638 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" Apr 17 17:24:50.147335 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.147315 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:50.148333 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.148307 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" Apr 17 17:24:50.156357 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.156342 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:50.716834 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.716804 2566 apiserver.go:52] "Watching apiserver" Apr 17 17:24:50.723350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.723321 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:24:50.726162 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.726137 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-kc5w4","kube-system/konnectivity-agent-q9m54","kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal","openshift-multus/multus-xxqdc","openshift-ovn-kubernetes/ovnkube-node-nz777","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf","openshift-cluster-node-tuning-operator/tuned-nmmln","openshift-image-registry/node-ca-ttt97","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal","openshift-multus/multus-additional-cni-plugins-qn8t8","openshift-multus/network-metrics-daemon-wgmfp","openshift-network-diagnostics/network-check-target-vdhlz"] Apr 17 17:24:50.729170 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.729117 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.730238 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.730217 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.730416 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.730393 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.731619 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.731600 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.732511 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.732495 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:24:50.732745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.732726 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:50.732745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.732743 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:50.732955 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.732783 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.733025 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.732961 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:24:50.733025 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.732984 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sp75x\"" Apr 17 17:24:50.733146 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.733087 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:24:50.733146 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.733103 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:24:50.733146 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.733116 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:24:50.733603 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.733585 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:24:50.735126 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.734302 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:24:50.735126 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.734651 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:24:50.735126 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.734676 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-f9gp2\"" Apr 17 17:24:50.735126 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.734900 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:24:50.735386 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.735137 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:24:50.736227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.735647 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-b6dcv\"" Apr 17 17:24:50.736227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.735735 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.736467 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.736447 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zqhdk\"" Apr 17 17:24:50.736914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.736677 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:24:50.736914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.736762 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:50.736914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.736765 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5x4pc\"" Apr 17 17:24:50.738010 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.737992 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:50.738110 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.738052 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vlcl6\"" Apr 17 17:24:50.738110 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.738106 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:24:50.738318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.738293 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:24:50.738318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.738314 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:24:50.739006 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.738962 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.739339 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.739153 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:50.739339 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.739218 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:24:50.739512 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.739487 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:24:50.740431 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.740413 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.741557 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.741539 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6t24b\"" Apr 17 17:24:50.741659 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.741541 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:24:50.741722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.741705 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:24:50.741883 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.741831 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:24:50.741985 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.741953 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:50.741985 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.741982 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:24:50.742093 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.742007 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:24:50.742486 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.742466 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:24:50.742569 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.742544 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-v4g62\"" Apr 17 17:24:50.743245 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.743227 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:24:50.744910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.744887 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-systemd\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.744999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.744922 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7m2\" (UniqueName: \"kubernetes.io/projected/a21d46e7-a070-48be-a4ce-d2af9bd52539-kube-api-access-hs7m2\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.744999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.744949 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-etc-selinux\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.744999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.744969 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-modprobe-d\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.744999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.744989 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-var-lib-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745012 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-device-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745036 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwrsv\" (UniqueName: 
\"kubernetes.io/projected/9ea4766d-2197-486d-8373-8f7340574545-kube-api-access-zwrsv\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745060 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-system-cni-dir\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745110 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-ovn\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745135 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/348f9f1d-49f3-4771-9e69-3b4ba90f29e9-konnectivity-ca\") pod \"konnectivity-agent-q9m54\" (UID: \"348f9f1d-49f3-4771-9e69-3b4ba90f29e9\") " pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.745167 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745151 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-slash\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745173 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-var-lib-kubelet\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745189 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-host\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-cnibin\") pod 
\"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745266 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-cni-netd\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a21d46e7-a070-48be-a4ce-d2af9bd52539-tmp\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745318 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovnkube-config\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745341 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-sys-fs\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745367 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd4hr\" (UniqueName: \"kubernetes.io/projected/b10257f3-c35f-4a14-bf62-ab474fc1eeae-kube-api-access-dd4hr\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745393 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745439 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745414 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-host\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745453 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b10257f3-c35f-4a14-bf62-ab474fc1eeae-iptables-alerter-script\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 
17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745486 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-serviceca\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-cni-bin\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745563 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-kubelet\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-node-log\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745594 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-log-socket\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745614 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745647 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-socket-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745665 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnncd\" (UniqueName: \"kubernetes.io/projected/968b8801-64dd-454d-b1af-675ad2d36924-kube-api-access-pnncd\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745681 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7qh\" (UniqueName: \"kubernetes.io/projected/35fe1bc2-e1a4-4744-8f67-3cc38747e567-kube-api-access-zz7qh\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745730 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-lib-modules\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745748 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-os-release\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745762 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.745795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745782 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-run-netns\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745817 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovn-node-metrics-cert\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-systemd\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745851 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovnkube-script-lib\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745872 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2248\" (UniqueName: \"kubernetes.io/projected/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-kube-api-access-k2248\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745895 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysctl-conf\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745921 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-cni-binary-copy\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-etc-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.745988 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-systemd-units\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746032 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysctl-d\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746089 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b10257f3-c35f-4a14-bf62-ab474fc1eeae-host-slash\") pod 
\"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746124 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/348f9f1d-49f3-4771-9e69-3b4ba90f29e9-agent-certs\") pod \"konnectivity-agent-q9m54\" (UID: \"348f9f1d-49f3-4771-9e69-3b4ba90f29e9\") " pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746169 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-registration-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746194 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-tuned\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746225 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-kubernetes\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.746378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746273 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-run\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.746968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746317 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-sys\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.746968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746358 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.746968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.746388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-env-overrides\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.746968 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:24:50.746420 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysconfig\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.778453 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.778428 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:49 +0000 UTC" deadline="2027-10-25 23:10:44.04705816 +0000 UTC" Apr 17 17:24:50.778453 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.778453 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13349h45m53.268609252s" Apr 17 17:24:50.836710 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.836682 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:24:50.847472 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-systemd\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847486 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovnkube-script-lib\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2248\" (UniqueName: \"kubernetes.io/projected/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-kube-api-access-k2248\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.847609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847530 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysctl-conf\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.847609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-cni-binary-copy\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.847609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847574 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-systemd\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847609 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-cni-multus\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847641 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-daemon-config\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqbcc\" (UniqueName: \"kubernetes.io/projected/ce658e37-54cb-4560-9405-21ba87bc0d35-kube-api-access-tqbcc\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847713 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-etc-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847763 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-systemd-units\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847791 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysctl-d\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b10257f3-c35f-4a14-bf62-ab474fc1eeae-host-slash\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847818 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-system-cni-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847875 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-k8s-cni-cncf-io\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.847894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847892 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-etc-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847904 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-cni-bin\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-systemd-units\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.847933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/348f9f1d-49f3-4771-9e69-3b4ba90f29e9-agent-certs\") pod \"konnectivity-agent-q9m54\" (UID: \"348f9f1d-49f3-4771-9e69-3b4ba90f29e9\") " pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848048 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysctl-d\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848059 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysctl-conf\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-registration-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848116 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-tuned\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848143 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-os-release\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848170 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce658e37-54cb-4560-9405-21ba87bc0d35-cni-binary-copy\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848193 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-cni-binary-copy\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848200 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovnkube-script-lib\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848240 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97gc\" (UniqueName: \"kubernetes.io/projected/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-kube-api-access-m97gc\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848284 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b10257f3-c35f-4a14-bf62-ab474fc1eeae-host-slash\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-registration-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.848408 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-kubernetes\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-run\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848365 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:24:50.848408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848389 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-sys\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848420 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-sys\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848512 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-netns\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848540 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-kubelet\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848567 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-env-overrides\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848593 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysconfig\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848617 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-systemd\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848642 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7m2\" (UniqueName: \"kubernetes.io/projected/a21d46e7-a070-48be-a4ce-d2af9bd52539-kube-api-access-hs7m2\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-etc-selinux\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848718 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-sysconfig\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848761 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-systemd\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848878 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-env-overrides\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.848370 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-run\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 
17:24:50.849278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849242 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-etc-selinux\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-modprobe-d\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849325 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-var-lib-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849351 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-device-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849372 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-kubernetes\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwrsv\" (UniqueName: \"kubernetes.io/projected/9ea4766d-2197-486d-8373-8f7340574545-kube-api-access-zwrsv\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849402 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-system-cni-dir\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849426 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-var-lib-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849460 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-modprobe-d\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849509 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-device-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-hostroot\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-conf-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849585 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-multus-certs\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-ovn\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/348f9f1d-49f3-4771-9e69-3b4ba90f29e9-konnectivity-ca\") pod \"konnectivity-agent-q9m54\" (UID: \"348f9f1d-49f3-4771-9e69-3b4ba90f29e9\") " pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-slash\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.849952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849683 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-system-cni-dir\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-var-lib-kubelet\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849739 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-host\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849765 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-cnibin\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849773 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-var-lib-kubelet\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-ovn\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849813 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-etc-kubernetes\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849877 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/968b8801-64dd-454d-b1af-675ad2d36924-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849907 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-cni-netd\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 
17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849936 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a21d46e7-a070-48be-a4ce-d2af9bd52539-tmp\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849938 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-host\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849943 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-slash\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849940 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-cnibin\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.849973 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovnkube-config\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850008 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-cni-netd\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850134 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-sys-fs\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd4hr\" (UniqueName: \"kubernetes.io/projected/b10257f3-c35f-4a14-bf62-ab474fc1eeae-kube-api-access-dd4hr\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.850743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-cni-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " 
pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850218 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/348f9f1d-49f3-4771-9e69-3b4ba90f29e9-konnectivity-ca\") pod \"konnectivity-agent-q9m54\" (UID: \"348f9f1d-49f3-4771-9e69-3b4ba90f29e9\") " pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850219 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850277 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-host\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850279 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-run-openvswitch\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850302 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b10257f3-c35f-4a14-bf62-ab474fc1eeae-iptables-alerter-script\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850330 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850381 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-serviceca\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850405 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: 
\"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850432 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-cni-bin\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-kubelet\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-node-log\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850513 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-log-socket\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850539 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850541 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850552 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-cni-bin\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.851552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-socket-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.852130 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:24:50.850589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850600 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnncd\" (UniqueName: \"kubernetes.io/projected/968b8801-64dd-454d-b1af-675ad2d36924-kube-api-access-pnncd\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850627 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7qh\" (UniqueName: \"kubernetes.io/projected/35fe1bc2-e1a4-4744-8f67-3cc38747e567-kube-api-access-zz7qh\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-kubelet\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-lib-modules\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-node-log\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850701 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-os-release\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850706 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-log-socket\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850512 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovnkube-config\") pod \"ovnkube-node-nz777\" (UID: 
\"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850745 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a21d46e7-a070-48be-a4ce-d2af9bd52539-lib-modules\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850787 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-os-release\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850824 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-host\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850866 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/968b8801-64dd-454d-b1af-675ad2d36924-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850895 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-cnibin\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-socket-dir-parent\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.852130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-run-netns\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850999 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovn-node-metrics-cert\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.851060 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-socket-dir\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.851087 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35fe1bc2-e1a4-4744-8f67-3cc38747e567-host-run-netns\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.850329 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9ea4766d-2197-486d-8373-8f7340574545-sys-fs\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.851626 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b10257f3-c35f-4a14-bf62-ab474fc1eeae-iptables-alerter-script\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.851824 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-serviceca\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.852005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a21d46e7-a070-48be-a4ce-d2af9bd52539-etc-tuned\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.852735 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.852228 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/348f9f1d-49f3-4771-9e69-3b4ba90f29e9-agent-certs\") pod \"konnectivity-agent-q9m54\" (UID: \"348f9f1d-49f3-4771-9e69-3b4ba90f29e9\") " pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:50.853335 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.853313 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35fe1bc2-e1a4-4744-8f67-3cc38747e567-ovn-node-metrics-cert\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.853463 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.853443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a21d46e7-a070-48be-a4ce-d2af9bd52539-tmp\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.856001 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.855979 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2248\" (UniqueName: \"kubernetes.io/projected/5a3ce8ab-c625-49a4-a457-49fae7f24c9a-kube-api-access-k2248\") pod \"node-ca-ttt97\" (UID: \"5a3ce8ab-c625-49a4-a457-49fae7f24c9a\") " pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:50.856690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.856667 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7m2\" (UniqueName: \"kubernetes.io/projected/a21d46e7-a070-48be-a4ce-d2af9bd52539-kube-api-access-hs7m2\") pod \"tuned-nmmln\" (UID: \"a21d46e7-a070-48be-a4ce-d2af9bd52539\") " pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:50.857669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.857638 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwrsv\" (UniqueName: \"kubernetes.io/projected/9ea4766d-2197-486d-8373-8f7340574545-kube-api-access-zwrsv\") pod \"aws-ebs-csi-driver-node-n6gnf\" (UID: \"9ea4766d-2197-486d-8373-8f7340574545\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:50.858411 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.858390 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd4hr\" (UniqueName: \"kubernetes.io/projected/b10257f3-c35f-4a14-bf62-ab474fc1eeae-kube-api-access-dd4hr\") pod \"iptables-alerter-kc5w4\" (UID: \"b10257f3-c35f-4a14-bf62-ab474fc1eeae\") " pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:50.859376 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.859348 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnncd\" (UniqueName: \"kubernetes.io/projected/968b8801-64dd-454d-b1af-675ad2d36924-kube-api-access-pnncd\") pod \"multus-additional-cni-plugins-qn8t8\" (UID: \"968b8801-64dd-454d-b1af-675ad2d36924\") " pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:50.859732 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.859710 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7qh\" (UniqueName: \"kubernetes.io/projected/35fe1bc2-e1a4-4744-8f67-3cc38747e567-kube-api-access-zz7qh\") pod \"ovnkube-node-nz777\" (UID: \"35fe1bc2-e1a4-4744-8f67-3cc38747e567\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:50.951902 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.951867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-cni-bin\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.951911 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-os-release\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952061 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:24:50.951937 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce658e37-54cb-4560-9405-21ba87bc0d35-cni-binary-copy\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.951959 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m97gc\" (UniqueName: \"kubernetes.io/projected/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-kube-api-access-m97gc\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:50.952061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.951969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-cni-bin\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.951980 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-netns\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952011 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-os-release\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952124 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-kubelet\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952150 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-kubelet\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952128 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-netns\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-hostroot\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952189 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-conf-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952232 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-conf-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952232 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-hostroot\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952272 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-multus-certs\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952313 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-etc-kubernetes\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-cni-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952370 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952378 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-etc-kubernetes\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952369 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-multus-certs\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952400 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: 
\"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:50.952419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952421 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-cni-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-cnibin\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-socket-dir-parent\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.952490 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-cni-multus\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952535 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-daemon-config\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952541 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-cnibin\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952565 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-socket-dir-parent\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952562 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce658e37-54cb-4560-9405-21ba87bc0d35-cni-binary-copy\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 
17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.952574 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.452532581 +0000 UTC m=+3.122612201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952583 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-var-lib-cni-multus\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952637 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqbcc\" (UniqueName: \"kubernetes.io/projected/ce658e37-54cb-4560-9405-21ba87bc0d35-kube-api-access-tqbcc\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952666 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-system-cni-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952688 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-k8s-cni-cncf-io\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952742 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-system-cni-dir\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952745 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce658e37-54cb-4560-9405-21ba87bc0d35-host-run-k8s-cni-cncf-io\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.952999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.952989 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce658e37-54cb-4560-9405-21ba87bc0d35-multus-daemon-config\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:50.958920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.958902 2566 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:50.959027 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.958963 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:50.959027 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.959013 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:50.959027 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.959026 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:50.959495 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:50.959467 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.459447922 +0000 UTC m=+3.129527593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:50.962203 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.962095 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97gc\" (UniqueName: \"kubernetes.io/projected/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-kube-api-access-m97gc\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:50.962302 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:50.962275 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqbcc\" (UniqueName: \"kubernetes.io/projected/ce658e37-54cb-4560-9405-21ba87bc0d35-kube-api-access-tqbcc\") pod \"multus-xxqdc\" (UID: \"ce658e37-54cb-4560-9405-21ba87bc0d35\") " pod="openshift-multus/multus-xxqdc" Apr 17 17:24:51.044585 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.044507 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kc5w4" Apr 17 17:24:51.052287 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.052262 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:24:51.059953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.059934 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:24:51.065551 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.065520 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" Apr 17 17:24:51.071140 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.071123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nmmln" Apr 17 17:24:51.077652 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.077637 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ttt97" Apr 17 17:24:51.083157 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.083139 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" Apr 17 17:24:51.087719 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.087703 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xxqdc" Apr 17 17:24:51.402327 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.402298 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35fe1bc2_e1a4_4744_8f67_3cc38747e567.slice/crio-5737c5277d87ef013f22b3ee3c21982a6e16fb167c163767416a1cb773dfb917 WatchSource:0}: Error finding container 5737c5277d87ef013f22b3ee3c21982a6e16fb167c163767416a1cb773dfb917: Status 404 returned error can't find the container with id 5737c5277d87ef013f22b3ee3c21982a6e16fb167c163767416a1cb773dfb917 Apr 17 17:24:51.403679 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.403628 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce658e37_54cb_4560_9405_21ba87bc0d35.slice/crio-7b4be8030956b3c6e85c274b094eb853d444b4641a9aa4044ab2c6c4b70b0215 WatchSource:0}: Error finding container 7b4be8030956b3c6e85c274b094eb853d444b4641a9aa4044ab2c6c4b70b0215: Status 404 returned error can't find the container with id 7b4be8030956b3c6e85c274b094eb853d444b4641a9aa4044ab2c6c4b70b0215 Apr 17 17:24:51.406076 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.406053 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb10257f3_c35f_4a14_bf62_ab474fc1eeae.slice/crio-6602a819e5e0172b3f12a1b5d4ef4336ddce006d69dba2c72870094ceda00718 WatchSource:0}: Error finding container 6602a819e5e0172b3f12a1b5d4ef4336ddce006d69dba2c72870094ceda00718: Status 404 returned error can't find the container with id 6602a819e5e0172b3f12a1b5d4ef4336ddce006d69dba2c72870094ceda00718 Apr 17 17:24:51.407561 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.407543 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod348f9f1d_49f3_4771_9e69_3b4ba90f29e9.slice/crio-06d321385e600dfe464be6762663d1f227aba0abfe8bd5fd7279f6a1c2d406e2 WatchSource:0}: Error finding container 06d321385e600dfe464be6762663d1f227aba0abfe8bd5fd7279f6a1c2d406e2: Status 404 returned error can't find the container with id 06d321385e600dfe464be6762663d1f227aba0abfe8bd5fd7279f6a1c2d406e2 Apr 17 17:24:51.409025 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.409003 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a3ce8ab_c625_49a4_a457_49fae7f24c9a.slice/crio-2273431c173bf2376dc355856efecf94c81c82e4ac4a6b8cf88fc0ceaf0a0bbb WatchSource:0}: Error finding container 
2273431c173bf2376dc355856efecf94c81c82e4ac4a6b8cf88fc0ceaf0a0bbb: Status 404 returned error can't find the container with id 2273431c173bf2376dc355856efecf94c81c82e4ac4a6b8cf88fc0ceaf0a0bbb Apr 17 17:24:51.410059 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.410040 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod968b8801_64dd_454d_b1af_675ad2d36924.slice/crio-0fd263548ec1aebfd8a4a38a9325c3c5b5250bb54c72cbbde9853302c7de2854 WatchSource:0}: Error finding container 0fd263548ec1aebfd8a4a38a9325c3c5b5250bb54c72cbbde9853302c7de2854: Status 404 returned error can't find the container with id 0fd263548ec1aebfd8a4a38a9325c3c5b5250bb54c72cbbde9853302c7de2854 Apr 17 17:24:51.411182 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.411014 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21d46e7_a070_48be_a4ce_d2af9bd52539.slice/crio-7b18b3905a4a5d047c114c9d6b833359944bb9cf1817c3f6000a1c6e1329b0d3 WatchSource:0}: Error finding container 7b18b3905a4a5d047c114c9d6b833359944bb9cf1817c3f6000a1c6e1329b0d3: Status 404 returned error can't find the container with id 7b18b3905a4a5d047c114c9d6b833359944bb9cf1817c3f6000a1c6e1329b0d3 Apr 17 17:24:51.413333 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:24:51.413186 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea4766d_2197_486d_8373_8f7340574545.slice/crio-5b99645e807d46d54ff9e1e4fe71dce0063fdc614c720af582d88a9464af4253 WatchSource:0}: Error finding container 5b99645e807d46d54ff9e1e4fe71dce0063fdc614c720af582d88a9464af4253: Status 404 returned error can't find the container with id 5b99645e807d46d54ff9e1e4fe71dce0063fdc614c720af582d88a9464af4253 Apr 17 17:24:51.456266 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.456230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:51.456367 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:51.456351 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:51.456410 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:51.456404 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:24:52.456390435 +0000 UTC m=+4.126470056 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:51.556931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.556901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:51.557067 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:51.557013 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:51.557067 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:51.557026 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:51.557067 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:51.557035 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:51.557164 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:51.557091 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:52.55707316 +0000 UTC m=+4.227152780 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:51.779601 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.779450 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:49 +0000 UTC" deadline="2028-01-09 16:31:34.836084884 +0000 UTC" Apr 17 17:24:51.779601 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.779505 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15167h6m43.056584193s" Apr 17 17:24:51.847612 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.847576 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" event={"ID":"cf7a4b638e7a30f012eeb3bf5b7f1ece","Type":"ContainerStarted","Data":"51c2958e04de51eab40c671e5d0f693e3373b1a091653293a669c5f58c3f3fee"} Apr 17 17:24:51.861615 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.861416 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" event={"ID":"9ea4766d-2197-486d-8373-8f7340574545","Type":"ContainerStarted","Data":"5b99645e807d46d54ff9e1e4fe71dce0063fdc614c720af582d88a9464af4253"} Apr 17 17:24:51.864329 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.864300 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerStarted","Data":"0fd263548ec1aebfd8a4a38a9325c3c5b5250bb54c72cbbde9853302c7de2854"} Apr 17 17:24:51.869032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.868857 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ttt97" event={"ID":"5a3ce8ab-c625-49a4-a457-49fae7f24c9a","Type":"ContainerStarted","Data":"2273431c173bf2376dc355856efecf94c81c82e4ac4a6b8cf88fc0ceaf0a0bbb"} Apr 17 17:24:51.877359 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.877325 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kc5w4" event={"ID":"b10257f3-c35f-4a14-bf62-ab474fc1eeae","Type":"ContainerStarted","Data":"6602a819e5e0172b3f12a1b5d4ef4336ddce006d69dba2c72870094ceda00718"} Apr 17 17:24:51.889407 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.889380 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxqdc" event={"ID":"ce658e37-54cb-4560-9405-21ba87bc0d35","Type":"ContainerStarted","Data":"7b4be8030956b3c6e85c274b094eb853d444b4641a9aa4044ab2c6c4b70b0215"} Apr 17 17:24:51.895727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.895702 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"5737c5277d87ef013f22b3ee3c21982a6e16fb167c163767416a1cb773dfb917"} Apr 17 17:24:51.899713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.899668 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nmmln" 
event={"ID":"a21d46e7-a070-48be-a4ce-d2af9bd52539","Type":"ContainerStarted","Data":"7b18b3905a4a5d047c114c9d6b833359944bb9cf1817c3f6000a1c6e1329b0d3"} Apr 17 17:24:51.900995 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:51.900962 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q9m54" event={"ID":"348f9f1d-49f3-4771-9e69-3b4ba90f29e9","Type":"ContainerStarted","Data":"06d321385e600dfe464be6762663d1f227aba0abfe8bd5fd7279f6a1c2d406e2"} Apr 17 17:24:52.298205 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.298177 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:52.465924 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.465812 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:52.466055 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.465965 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:52.466055 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.466042 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.466023394 +0000 UTC m=+6.136103027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:52.547848 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.547818 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:52.567088 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.567056 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:52.567088 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.567091 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:52.567333 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.567104 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:52.567333 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.567163 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.567143055 +0000 UTC m=+6.237222695 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:52.567333 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.567275 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:52.834423 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.834394 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:52.834867 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.834529 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:24:52.834956 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.834937 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:52.835630 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:52.835600 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:24:52.920762 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.916919 2566 generic.go:358] "Generic (PLEG): container finished" podID="fc5397f3b7e87f74f15b6f9197df7f72" containerID="5f7644889ce30e55e1c6c96ea71d0c2619c52db3c1b0f9afa971ef21a355ae23" exitCode=0 Apr 17 17:24:52.920762 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.917083 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" event={"ID":"fc5397f3b7e87f74f15b6f9197df7f72","Type":"ContainerDied","Data":"5f7644889ce30e55e1c6c96ea71d0c2619c52db3c1b0f9afa971ef21a355ae23"} Apr 17 17:24:52.931586 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:52.931021 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-147.ec2.internal" podStartSLOduration=2.931005857 podStartE2EDuration="2.931005857s" podCreationTimestamp="2026-04-17 17:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:51.866121365 +0000 UTC m=+3.536201007" watchObservedRunningTime="2026-04-17 17:24:52.931005857 +0000 UTC m=+4.601085501" Apr 17 17:24:53.921923 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:53.921870 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" event={"ID":"fc5397f3b7e87f74f15b6f9197df7f72","Type":"ContainerStarted","Data":"3318a2efc72193782c4f4a5a9d6723704e1f1341fa9b853ee098b9394c2d32f9"} Apr 17 17:24:54.484112 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:54.484076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:54.484304 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.484245 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:54.484370 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.484331 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:24:58.484312286 +0000 UTC m=+10.154391919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:54.585360 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:54.584784 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:54.585360 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.584937 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:54.585360 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.584958 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:54.585360 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.584972 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:54.585360 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.585025 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:58.585006804 +0000 UTC m=+10.255086440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:54.836001 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:54.835416 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:54.836001 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.835563 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:24:54.836001 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:54.835940 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:54.836328 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:54.836045 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:24:56.128727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.128546 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-147.ec2.internal" podStartSLOduration=6.128525565 podStartE2EDuration="6.128525565s" podCreationTimestamp="2026-04-17 17:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:53.935651948 +0000 UTC m=+5.605731591" watchObservedRunningTime="2026-04-17 17:24:56.128525565 +0000 UTC m=+7.798605208" Apr 17 17:24:56.129197 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.129087 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-td48j"] Apr 17 17:24:56.132423 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.132392 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.135640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.135178 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:24:56.135640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.135369 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:24:56.135640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.135379 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zpnn2\"" Apr 17 17:24:56.199293 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.199241 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-hosts-file\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.199483 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.199331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzgfw\" (UniqueName: \"kubernetes.io/projected/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-kube-api-access-kzgfw\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.199483 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.199376 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-tmp-dir\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.300195 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.300156 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-hosts-file\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.300401 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.300227 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzgfw\" (UniqueName: \"kubernetes.io/projected/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-kube-api-access-kzgfw\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.300401 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.300291 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-tmp-dir\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.300741 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.300641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-tmp-dir\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.300741 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.300692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-hosts-file\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.321152 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.321125 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzgfw\" (UniqueName: \"kubernetes.io/projected/9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd-kube-api-access-kzgfw\") pod \"node-resolver-td48j\" (UID: \"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd\") " pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.447795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.447685 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-td48j" Apr 17 17:24:56.834191 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.834160 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:56.834481 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:56.834302 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:24:56.834856 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:56.834828 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:56.834951 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:56.834935 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:24:58.517499 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:58.517328 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:58.517499 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.517487 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:58.518111 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.517551 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.51753311 +0000 UTC m=+18.187612753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:58.618434 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:58.618386 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:58.618616 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.618553 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:58.618616 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.618579 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:58.618616 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.618592 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:58.618760 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.618658 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:06.618639221 +0000 UTC m=+18.288718861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:58.835203 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:58.835058 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:24:58.835203 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:24:58.835103 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:24:58.835203 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.835162 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:24:58.835517 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:24:58.835213 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:00.833999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:00.833960 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:00.834473 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:00.833969 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:00.834473 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:00.834110 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:00.834473 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:00.834163 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:02.833741 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:02.833702 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:02.834154 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:02.833757 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:02.834154 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:02.833833 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:02.834154 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:02.833969 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:04.834450 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:04.834182 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:04.834878 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:04.834207 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:04.834878 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:04.834567 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:04.834878 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:04.834619 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:05.338438 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.338405 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q4gqn"] Apr 17 17:25:05.406039 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.406008 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.406205 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:05.406091 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:05.471661 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.471621 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b2a4b11e-5add-4df7-8e69-5b3342e010fe-dbus\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.471837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.471682 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.471837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.471773 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b2a4b11e-5add-4df7-8e69-5b3342e010fe-kubelet-config\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.572956 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.572921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.573117 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.572977 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b2a4b11e-5add-4df7-8e69-5b3342e010fe-kubelet-config\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.573117 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.573023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b2a4b11e-5add-4df7-8e69-5b3342e010fe-dbus\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.573117 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:05.573089 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:05.573268 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.573126 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b2a4b11e-5add-4df7-8e69-5b3342e010fe-kubelet-config\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:05.573268 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:05.573158 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:06.073137023 +0000 UTC m=+17.743216650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:05.573268 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:05.573245 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b2a4b11e-5add-4df7-8e69-5b3342e010fe-dbus\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:06.077227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:06.077193 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:06.077603 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.077370 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:06.077603 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.077442 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.077421554 +0000 UTC m=+18.747501183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:06.579924 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:06.579894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:06.580066 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.580001 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:06.580066 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.580063 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:22.580043531 +0000 UTC m=+34.250123157 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:06.680559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:06.680524 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:06.680732 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.680707 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:06.680732 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.680728 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:06.680844 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.680741 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:06.680844 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.680801 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:22.680779985 +0000 UTC m=+34.350859619 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:06.833620 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:06.833538 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:06.833777 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:06.833544 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:06.833777 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.833669 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:06.833777 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.833752 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:06.833777 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:06.833544 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:06.833964 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:06.833824 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:07.084756 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:07.084679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:07.085172 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:07.084798 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:07.085172 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:07.084870 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.084847595 +0000 UTC m=+20.754927254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:08.178601 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:08.178566 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5aeedb_5fe5_4d2a_bc4d_db986c41c8fd.slice/crio-990c0c9be1a2f27baebcd17135bb24563ed7ac6667d5794696197a3b7de0483d WatchSource:0}: Error finding container 990c0c9be1a2f27baebcd17135bb24563ed7ac6667d5794696197a3b7de0483d: Status 404 returned error can't find the container with id 990c0c9be1a2f27baebcd17135bb24563ed7ac6667d5794696197a3b7de0483d Apr 17 17:25:08.834310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.834106 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:08.834408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.834200 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:08.834408 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:08.834385 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:08.834509 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:08.834456 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:08.834509 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.834223 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:08.834614 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:08.834522 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:08.944312 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.944275 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" event={"ID":"9ea4766d-2197-486d-8373-8f7340574545","Type":"ContainerStarted","Data":"1a887f858eff3570583957afc580a44adcaaa8dcc55b6823adf38c484c88ca40"} Apr 17 17:25:08.945508 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.945485 2566 generic.go:358] "Generic (PLEG): container finished" podID="968b8801-64dd-454d-b1af-675ad2d36924" containerID="a266d67798a54969b1a36462bcbfccde156cff595537083a2141c497ffcbc758" exitCode=0 Apr 17 17:25:08.945608 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.945550 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerDied","Data":"a266d67798a54969b1a36462bcbfccde156cff595537083a2141c497ffcbc758"} Apr 17 17:25:08.947602 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.947291 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ttt97" event={"ID":"5a3ce8ab-c625-49a4-a457-49fae7f24c9a","Type":"ContainerStarted","Data":"a7f6bc17b9473f7885bd966902167898bfa93b661c88fdcbe83023024368f424"} Apr 17 17:25:08.948781 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.948753 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxqdc" event={"ID":"ce658e37-54cb-4560-9405-21ba87bc0d35","Type":"ContainerStarted","Data":"0d54c079cab1eb1984655c287afd9c6a9fcd06d0223f1e9ec9c3d215abc73f55"} Apr 17 17:25:08.950984 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.950965 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:25:08.951375 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.951356 2566 generic.go:358] "Generic (PLEG): container finished" podID="35fe1bc2-e1a4-4744-8f67-3cc38747e567" containerID="98b0ce18798bfe25695bc6d227d25de5a7375f30369f0d38e25a19fd97405dc6" exitCode=1 Apr 17 17:25:08.951479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.951416 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"0654f045573323ceba58f8c79cc662009084e390140e4c37825ac33f2dcb92e3"} Apr 17 17:25:08.951479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.951436 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"f5f148e67a596b1b779660f05a6be71c19ea9e63c5f5d7cbd9b86ee8969cdd23"} Apr 17 17:25:08.951479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.951447 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"b1d7641e7b1cccb38bed394b6c6d46d4a6741b05197531eaae262154cfe01464"} Apr 17 17:25:08.951479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.951456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerDied","Data":"98b0ce18798bfe25695bc6d227d25de5a7375f30369f0d38e25a19fd97405dc6"} Apr 17 17:25:08.951479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.951466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"dd6b3c517ce038d40d29507aab7039332dd14deda270d71da6c0090064efd90e"} Apr 17 17:25:08.952765 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.952743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nmmln" event={"ID":"a21d46e7-a070-48be-a4ce-d2af9bd52539","Type":"ContainerStarted","Data":"c91856d82c161c7202eaae67906e9c9d23e5ca491691c6ac5c5e36c2ddfc70bc"} Apr 17 17:25:08.954236 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.954209 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q9m54" event={"ID":"348f9f1d-49f3-4771-9e69-3b4ba90f29e9","Type":"ContainerStarted","Data":"b60d205f2b23336cac0da4c620ea8b2974ff3ef841ccec2eb0066d7a400147af"} Apr 17 17:25:08.955542 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.955519 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-td48j" event={"ID":"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd","Type":"ContainerStarted","Data":"73b1f9953bffe29096ca2cd20bc1fff9ce52e1745bd7faca08fcc4154235baab"} Apr 17 17:25:08.955608 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.955549 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-td48j" event={"ID":"9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd","Type":"ContainerStarted","Data":"990c0c9be1a2f27baebcd17135bb24563ed7ac6667d5794696197a3b7de0483d"} Apr 17 17:25:08.989520 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:08.989468 2566 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/konnectivity-agent-q9m54" podStartSLOduration=4.223375927 podStartE2EDuration="20.989451543s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.408988012 +0000 UTC m=+3.079067632" lastFinishedPulling="2026-04-17 17:25:08.175063624 +0000 UTC m=+19.845143248" observedRunningTime="2026-04-17 17:25:08.989104345 +0000 UTC m=+20.659183989" watchObservedRunningTime="2026-04-17 17:25:08.989451543 +0000 UTC m=+20.659531186" Apr 17 17:25:09.009762 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.009722 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nmmln" podStartSLOduration=4.245817097 podStartE2EDuration="21.009710001s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.412915968 +0000 UTC m=+3.082995594" lastFinishedPulling="2026-04-17 17:25:08.176808866 +0000 UTC m=+19.846888498" observedRunningTime="2026-04-17 17:25:09.00966776 +0000 UTC m=+20.679747403" watchObservedRunningTime="2026-04-17 17:25:09.009710001 +0000 UTC m=+20.679789643" Apr 17 17:25:09.019413 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.019392 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:25:09.019931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.019913 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:25:09.025657 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.025612 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-td48j" podStartSLOduration=13.025598688 podStartE2EDuration="13.025598688s" podCreationTimestamp="2026-04-17 17:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:09.024908438 +0000 UTC m=+20.694988082" watchObservedRunningTime="2026-04-17 17:25:09.025598688 +0000 UTC m=+20.695678331" Apr 17 17:25:09.044694 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.044655 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ttt97" podStartSLOduration=4.280474719 podStartE2EDuration="21.044641278s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.41086869 +0000 UTC m=+3.080948311" lastFinishedPulling="2026-04-17 17:25:08.17503525 +0000 UTC m=+19.845114870" observedRunningTime="2026-04-17 17:25:09.044173084 +0000 UTC m=+20.714252727" watchObservedRunningTime="2026-04-17 17:25:09.044641278 +0000 UTC m=+20.714720919" Apr 17 17:25:09.062975 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.062938 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xxqdc" podStartSLOduration=3.076267866 podStartE2EDuration="20.062925504s" podCreationTimestamp="2026-04-17 17:24:49 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.40545174 +0000 UTC m=+3.075531360" lastFinishedPulling="2026-04-17 17:25:08.392109375 +0000 UTC m=+20.062188998" observedRunningTime="2026-04-17 17:25:09.062315961 +0000 UTC m=+20.732395603" watchObservedRunningTime="2026-04-17 17:25:09.062925504 +0000 UTC m=+20.733005146" Apr 17 17:25:09.098372 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.098340 2566 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:09.098721 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:09.098701 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:09.098836 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:09.098749 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.098736797 +0000 UTC m=+24.768816418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:09.760181 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.760027 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:25:09.815739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.815593 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:25:09.760178248Z","UUID":"d8f5d72c-58e4-4dfb-9921-66759708bd01","Handler":null,"Name":"","Endpoint":""} Apr 17 17:25:09.817842 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.817802 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:25:09.817943 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.817852 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:25:09.960215 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.960141 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:25:09.960563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.960539 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"baa3508e8657040c945d94d83b7fa442b086b0c93f8b84f5405cb2704b7c3a53"} Apr 17 17:25:09.962159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.962133 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" event={"ID":"9ea4766d-2197-486d-8373-8f7340574545","Type":"ContainerStarted","Data":"9a5f2a2bb46e0d10fbf83dd392af500d0cabfa8af4c31013d7992355d7f144c5"} Apr 17 17:25:09.963840 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.963795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kc5w4" 
event={"ID":"b10257f3-c35f-4a14-bf62-ab474fc1eeae","Type":"ContainerStarted","Data":"1c8d85b1ccbfd88942bfaa09297fe9f4df06c459d6294e859f2b0b530694957e"} Apr 17 17:25:09.964665 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.964649 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:25:09.965244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.965223 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q9m54" Apr 17 17:25:09.996941 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:09.996892 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kc5w4" podStartSLOduration=5.309028813 podStartE2EDuration="21.99687795s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.407573515 +0000 UTC m=+3.077653149" lastFinishedPulling="2026-04-17 17:25:08.095422664 +0000 UTC m=+19.765502286" observedRunningTime="2026-04-17 17:25:09.981352258 +0000 UTC m=+21.651431900" watchObservedRunningTime="2026-04-17 17:25:09.99687795 +0000 UTC m=+21.666957647" Apr 17 17:25:10.833882 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:10.833857 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:10.834269 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:10.833860 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:10.834269 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:10.833995 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:10.834269 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:10.834101 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:10.834269 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:10.833857 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:10.834269 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:10.834207 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:10.967824 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:10.967784 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" event={"ID":"9ea4766d-2197-486d-8373-8f7340574545","Type":"ContainerStarted","Data":"db9cde949fee521d9145f44442da740858c6e9e96df4d7208bd6c4dd38e99319"} Apr 17 17:25:11.972878 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:11.972849 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:25:11.973511 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:11.973267 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"f96439c1646fba752cbd505b8268a53ac45f3b1df0d9aa9b172c28cc890d5fc7"} Apr 17 17:25:12.834452 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:12.834419 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:12.834630 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:12.834420 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:12.834630 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:12.834544 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:12.834630 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:12.834419 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:12.834785 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:12.834646 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:12.834785 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:12.834672 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:13.130588 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.130509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:13.130989 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:13.130652 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:13.130989 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:13.130724 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.130708517 +0000 UTC m=+32.800788143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:13.978191 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.978026 2566 generic.go:358] "Generic (PLEG): container finished" podID="968b8801-64dd-454d-b1af-675ad2d36924" containerID="237696395bc444b0902115010f0cabb6151474e289638ed4092173e8656b0a41" exitCode=0 Apr 17 17:25:13.978370 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.978113 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerDied","Data":"237696395bc444b0902115010f0cabb6151474e289638ed4092173e8656b0a41"} Apr 17 17:25:13.981409 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.981393 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:25:13.981758 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.981720 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"d0fd9dab64698358440cb7cb62bcced6bcbc806e07490c72efa2de90eeec0a62"} Apr 17 17:25:13.982057 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.982026 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:25:13.982057 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.982052 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:25:13.982291 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.982179 2566 scope.go:117] "RemoveContainer" containerID="98b0ce18798bfe25695bc6d227d25de5a7375f30369f0d38e25a19fd97405dc6" Apr 17 17:25:13.997458 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:13.997441 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:25:14.001609 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:25:14.001575 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6gnf" podStartSLOduration=6.620035902 podStartE2EDuration="26.001563108s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.414897668 +0000 UTC m=+3.084977294" lastFinishedPulling="2026-04-17 17:25:10.796424871 +0000 UTC m=+22.466504500" observedRunningTime="2026-04-17 17:25:10.989607139 +0000 UTC m=+22.659686780" watchObservedRunningTime="2026-04-17 17:25:14.001563108 +0000 UTC m=+25.671642750" Apr 17 17:25:14.833846 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.833677 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:14.834290 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.833677 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:14.834290 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:14.833930 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:14.834290 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.833678 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:14.834290 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:14.834049 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:14.834290 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:14.834114 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:14.987713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.987646 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:25:14.988116 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.988082 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" event={"ID":"35fe1bc2-e1a4-4744-8f67-3cc38747e567","Type":"ContainerStarted","Data":"8b803c9dfeaa422d6de16570979462a6f691ef60ac874614f84e67ba73d49578"} Apr 17 17:25:14.988393 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.988375 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:25:14.990328 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.990301 2566 generic.go:358] "Generic (PLEG): container finished" podID="968b8801-64dd-454d-b1af-675ad2d36924" containerID="ca312bc241692258fe5bd5617014366e2ee43da096cbb2d328386d2e2b66cfee" exitCode=0 Apr 17 17:25:14.990431 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:14.990338 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerDied","Data":"ca312bc241692258fe5bd5617014366e2ee43da096cbb2d328386d2e2b66cfee"} Apr 17 17:25:15.003926 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.003901 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:25:15.018654 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.018605 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" podStartSLOduration=10.204391545 podStartE2EDuration="27.018586991s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.404701877 +0000 UTC m=+3.074781497" lastFinishedPulling="2026-04-17 17:25:08.218897308 +0000 UTC m=+19.888976943" observedRunningTime="2026-04-17 17:25:15.017890636 +0000 UTC m=+26.687970277" watchObservedRunningTime="2026-04-17 17:25:15.018586991 +0000 UTC m=+26.688666635" Apr 17 17:25:15.256223 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.256150 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wgmfp"] Apr 17 17:25:15.256386 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.256271 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:15.256386 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:15.256365 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:15.256793 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.256771 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q4gqn"] Apr 17 17:25:15.256888 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.256854 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:15.256947 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:15.256930 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:15.258104 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.258083 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vdhlz"] Apr 17 17:25:15.258177 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.258142 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:15.258227 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:15.258203 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:15.994275 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.994183 2566 generic.go:358] "Generic (PLEG): container finished" podID="968b8801-64dd-454d-b1af-675ad2d36924" containerID="ce65e0a34ad15122b5e1295fccac9898cc3a0d6363b8a72b6f36e7111fa1d7fd" exitCode=0 Apr 17 17:25:15.994778 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:15.994287 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerDied","Data":"ce65e0a34ad15122b5e1295fccac9898cc3a0d6363b8a72b6f36e7111fa1d7fd"} Apr 17 17:25:16.833740 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:16.833704 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:16.833908 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:16.833815 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:16.834017 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:16.833994 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:16.834017 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:16.834005 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:16.834179 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:16.834103 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:16.834266 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:16.834183 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:17.163213 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:17.163126 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-td48j_9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd/dns-node-resolver/0.log" Apr 17 17:25:18.148828 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:18.148802 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ttt97_5a3ce8ab-c625-49a4-a457-49fae7f24c9a/node-ca/0.log" Apr 17 17:25:18.835274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:18.835236 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:18.835806 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:18.835333 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:18.835806 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:18.835366 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:18.835806 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:18.835377 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:18.835806 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:18.835437 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:18.835806 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:18.835532 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:20.833531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:20.833495 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:20.833976 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:20.833608 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:20.833976 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:20.833635 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:20.833976 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:20.833746 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:20.833976 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:20.833642 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:20.833976 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:20.833826 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:21.194293 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:21.194179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:21.194450 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:21.194322 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:21.194450 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:21.194393 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.194373406 +0000 UTC m=+48.864453037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:22.008375 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.008162 2566 generic.go:358] "Generic (PLEG): container finished" podID="968b8801-64dd-454d-b1af-675ad2d36924" containerID="1b749cfdc254ca34717732a2fa3678f4a6b5d7637dde160e6ed695dd3213c5d4" exitCode=0 Apr 17 17:25:22.008375 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.008241 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerDied","Data":"1b749cfdc254ca34717732a2fa3678f4a6b5d7637dde160e6ed695dd3213c5d4"} Apr 17 17:25:22.608385 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.608345 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:22.608560 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.608460 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:22.608560 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.608510 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs podName:b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.608495482 +0000 UTC m=+66.278575102 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs") pod "network-metrics-daemon-wgmfp" (UID: "b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:22.709381 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.709349 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:22.709529 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.709468 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:22.709529 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.709480 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:22.709529 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.709489 2566 projected.go:194] Error preparing data for projected volume kube-api-access-rvbnr for pod openshift-network-diagnostics/network-check-target-vdhlz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:22.709628 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.709531 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr podName:d29a1468-8ac5-454a-a993-6a5055191ec4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.709520089 +0000 UTC m=+66.379599709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvbnr" (UniqueName: "kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr") pod "network-check-target-vdhlz" (UID: "d29a1468-8ac5-454a-a993-6a5055191ec4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:22.834503 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.834472 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:22.834676 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.834476 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:22.834676 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.834578 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:22.834676 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.834665 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:22.834836 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:22.834680 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:22.834836 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:22.834738 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:23.012594 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:23.012516 2566 generic.go:358] "Generic (PLEG): container finished" podID="968b8801-64dd-454d-b1af-675ad2d36924" containerID="c843ae314518c66950b5d325895e6a7061708ae5d888297a58f7b39e7a7bc166" exitCode=0 Apr 17 17:25:23.012594 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:23.012573 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerDied","Data":"c843ae314518c66950b5d325895e6a7061708ae5d888297a58f7b39e7a7bc166"} Apr 17 17:25:24.016827 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:24.016791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" event={"ID":"968b8801-64dd-454d-b1af-675ad2d36924","Type":"ContainerStarted","Data":"de3131ada19172544ea6a3a778ff303b36ca8252eaddc54ab28ce4f357e374f9"} Apr 17 17:25:24.043678 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:24.043634 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qn8t8" podStartSLOduration=5.90562233 podStartE2EDuration="36.043620466s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:24:51.411530308 +0000 UTC m=+3.081609930" lastFinishedPulling="2026-04-17 17:25:21.549528438 +0000 UTC m=+33.219608066" observedRunningTime="2026-04-17 17:25:24.043509415 +0000 UTC m=+35.713589056" watchObservedRunningTime="2026-04-17 17:25:24.043620466 +0000 UTC m=+35.713700086" Apr 17 17:25:24.833782 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:24.833747 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:24.833937 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:24.833787 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:24.833937 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:24.833866 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:24.833937 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:24.833884 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:24.834100 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:24.833958 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:24.834100 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:24.834044 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:26.834194 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:26.834166 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:26.834716 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:26.834168 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:26.834716 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:26.834274 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:26.834716 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:26.834366 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:26.834716 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:26.834170 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:26.834716 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:26.834482 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:28.834481 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:28.834452 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:28.834991 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:28.834536 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:28.834991 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:28.834578 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:28.834991 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:28.834623 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:28.834991 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:28.834650 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:28.834991 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:28.834702 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:30.833798 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:30.833769 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:30.834171 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:30.833862 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:30.834171 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:30.833877 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:30.834171 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:30.833865 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:30.834171 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:30.833943 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:30.834171 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:30.834023 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:32.833523 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:32.833495 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:32.833888 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:32.833494 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:32.833888 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:32.833596 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:32.833888 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:32.833495 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:32.833888 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:32.833691 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:32.833888 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:32.833788 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:34.833690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:34.833657 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:34.834150 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:34.833657 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:34.834150 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:34.833771 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:34.834150 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:34.833665 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:34.834150 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:34.833835 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:34.834150 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:34.833892 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:36.833643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:36.833612 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:36.833643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:36.833628 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:36.834157 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:36.833628 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:36.834157 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:36.833727 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:36.834157 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:36.833816 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:36.834157 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:36.833880 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:37.230660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:37.230583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:37.230796 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:37.230689 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:37.230796 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:37.230740 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret podName:b2a4b11e-5add-4df7-8e69-5b3342e010fe nodeName:}" failed. No retries permitted until 2026-04-17 17:26:09.230723294 +0000 UTC m=+80.900802915 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret") pod "global-pull-secret-syncer-q4gqn" (UID: "b2a4b11e-5add-4df7-8e69-5b3342e010fe") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:38.834824 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:38.834789 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:38.835288 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:38.834873 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q4gqn" podUID="b2a4b11e-5add-4df7-8e69-5b3342e010fe" Apr 17 17:25:38.835288 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:38.834949 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:38.835288 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:38.835054 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:38.835288 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:38.835054 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdhlz" podUID="d29a1468-8ac5-454a-a993-6a5055191ec4" Apr 17 17:25:38.835288 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:38.835145 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgmfp" podUID="b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a" Apr 17 17:25:39.145100 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.145029 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-147.ec2.internal" event="NodeReady" Apr 17 17:25:39.145281 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.145164 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:25:39.188963 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.188929 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5848fc8888-kv722"] Apr 17 17:25:39.218715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.218686 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sdn9m"] Apr 17 17:25:39.218859 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.218827 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.221920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.221900 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:25:39.222062 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.222046 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:25:39.222930 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.222904 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvzpk\"" Apr 17 17:25:39.223009 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.222968 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:25:39.226917 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.226897 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:25:39.239013 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.238991 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-45bfq"] Apr 17 17:25:39.239142 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.239127 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.242665 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.242647 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:25:39.242761 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.242696 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:25:39.243025 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.243010 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4n9s5\"" Apr 17 17:25:39.243025 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.243019 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:25:39.243141 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.243056 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:25:39.266649 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.266354 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5848fc8888-kv722"] Apr 17 17:25:39.266649 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.266493 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.266829 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.266499 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sdn9m"] Apr 17 17:25:39.266927 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.266916 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-45bfq"] Apr 17 17:25:39.269829 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.269807 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:25:39.270137 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.270119 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:25:39.270482 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.270454 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9l5w8\"" Apr 17 17:25:39.306242 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.306221 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6bftx"] Apr 17 17:25:39.329042 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.329019 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6bftx"] Apr 17 17:25:39.329158 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.329134 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.331340 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.331320 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9t7x\"" Apr 17 17:25:39.331554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.331540 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:25:39.332075 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.332060 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:25:39.332144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.332085 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:25:39.345070 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345049 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3bc8d11b-297a-4446-8745-9da775945615-ca-trust-extracted\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345163 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345077 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bc8d11b-297a-4446-8745-9da775945615-trusted-ca\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345163 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3bc8d11b-297a-4446-8745-9da775945615-installation-pull-secrets\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345281 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345171 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/624a9915-ae5a-4744-8433-7bcca05df5bf-metrics-tls\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.345281 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345269 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-registry-tls\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345395 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345304 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3bc8d11b-297a-4446-8745-9da775945615-registry-certificates\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " 
pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345395 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhlf\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-kube-api-access-rqhlf\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345395 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345359 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4h7k\" (UniqueName: \"kubernetes.io/projected/808d341a-97c0-4c38-aece-d3877936bcb6-kube-api-access-m4h7k\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.345543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345402 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/808d341a-97c0-4c38-aece-d3877936bcb6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.345543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345427 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/808d341a-97c0-4c38-aece-d3877936bcb6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.345543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345472 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/808d341a-97c0-4c38-aece-d3877936bcb6-data-volume\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.345543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345504 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/624a9915-ae5a-4744-8433-7bcca05df5bf-config-volume\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.345543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345527 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/624a9915-ae5a-4744-8433-7bcca05df5bf-tmp-dir\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.345715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345551 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqhk6\" (UniqueName: \"kubernetes.io/projected/624a9915-ae5a-4744-8433-7bcca05df5bf-kube-api-access-vqhk6\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " 
pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.345715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345578 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/808d341a-97c0-4c38-aece-d3877936bcb6-crio-socket\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.345715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345622 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-bound-sa-token\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.345715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.345641 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3bc8d11b-297a-4446-8745-9da775945615-image-registry-private-configuration\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.446892 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446809 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-bound-sa-token\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.446892 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446848 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3bc8d11b-297a-4446-8745-9da775945615-image-registry-private-configuration\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446893 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3bc8d11b-297a-4446-8745-9da775945615-ca-trust-extracted\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446917 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bc8d11b-297a-4446-8745-9da775945615-trusted-ca\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446938 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3bc8d11b-297a-4446-8745-9da775945615-installation-pull-secrets\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " 
pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/624a9915-ae5a-4744-8433-7bcca05df5bf-metrics-tls\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.446996 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44fc3e42-b872-4c53-99af-7d32682facb5-cert\") pod \"ingress-canary-6bftx\" (UID: \"44fc3e42-b872-4c53-99af-7d32682facb5\") " pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447031 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-registry-tls\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447057 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3bc8d11b-297a-4446-8745-9da775945615-registry-certificates\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447083 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhlf\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-kube-api-access-rqhlf\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4h7k\" (UniqueName: \"kubernetes.io/projected/808d341a-97c0-4c38-aece-d3877936bcb6-kube-api-access-m4h7k\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447157 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/808d341a-97c0-4c38-aece-d3877936bcb6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447182 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/808d341a-97c0-4c38-aece-d3877936bcb6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:25:39.447477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/808d341a-97c0-4c38-aece-d3877936bcb6-data-volume\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/624a9915-ae5a-4744-8433-7bcca05df5bf-config-volume\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.447772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447543 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/624a9915-ae5a-4744-8433-7bcca05df5bf-tmp-dir\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.447772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447569 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqhk6\" (UniqueName: \"kubernetes.io/projected/624a9915-ae5a-4744-8433-7bcca05df5bf-kube-api-access-vqhk6\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.447772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447599 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvqw\" (UniqueName: \"kubernetes.io/projected/44fc3e42-b872-4c53-99af-7d32682facb5-kube-api-access-cwvqw\") pod \"ingress-canary-6bftx\" (UID: \"44fc3e42-b872-4c53-99af-7d32682facb5\") " pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.447772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447630 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/808d341a-97c0-4c38-aece-d3877936bcb6-crio-socket\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.447772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447755 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/808d341a-97c0-4c38-aece-d3877936bcb6-crio-socket\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.448023 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.447839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/808d341a-97c0-4c38-aece-d3877936bcb6-data-volume\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.448398 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.448242 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3bc8d11b-297a-4446-8745-9da775945615-registry-certificates\") pod \"image-registry-5848fc8888-kv722\" (UID: 
\"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.448398 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.448280 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/624a9915-ae5a-4744-8433-7bcca05df5bf-tmp-dir\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.448398 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.448322 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/624a9915-ae5a-4744-8433-7bcca05df5bf-config-volume\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.448576 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.448438 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bc8d11b-297a-4446-8745-9da775945615-trusted-ca\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.448720 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.448687 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/808d341a-97c0-4c38-aece-d3877936bcb6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.451340 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.451317 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3bc8d11b-297a-4446-8745-9da775945615-installation-pull-secrets\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.451423 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.451351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/808d341a-97c0-4c38-aece-d3877936bcb6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.451471 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.451431 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-registry-tls\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.451471 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.451433 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/624a9915-ae5a-4744-8433-7bcca05df5bf-metrics-tls\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.451538 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.451486 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3bc8d11b-297a-4446-8745-9da775945615-image-registry-private-configuration\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.455949 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.455926 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3bc8d11b-297a-4446-8745-9da775945615-ca-trust-extracted\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.460546 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.460524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqhk6\" (UniqueName: \"kubernetes.io/projected/624a9915-ae5a-4744-8433-7bcca05df5bf-kube-api-access-vqhk6\") pod \"dns-default-45bfq\" (UID: \"624a9915-ae5a-4744-8433-7bcca05df5bf\") " pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.460759 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.460739 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-bound-sa-token\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.460810 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.460793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhlf\" (UniqueName: \"kubernetes.io/projected/3bc8d11b-297a-4446-8745-9da775945615-kube-api-access-rqhlf\") pod \"image-registry-5848fc8888-kv722\" (UID: \"3bc8d11b-297a-4446-8745-9da775945615\") " pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.461372 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.461355 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4h7k\" (UniqueName: \"kubernetes.io/projected/808d341a-97c0-4c38-aece-d3877936bcb6-kube-api-access-m4h7k\") pod \"insights-runtime-extractor-sdn9m\" (UID: \"808d341a-97c0-4c38-aece-d3877936bcb6\") " pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.529579 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.529545 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:39.547294 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.547269 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sdn9m" Apr 17 17:25:39.547987 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.547967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44fc3e42-b872-4c53-99af-7d32682facb5-cert\") pod \"ingress-canary-6bftx\" (UID: \"44fc3e42-b872-4c53-99af-7d32682facb5\") " pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.548073 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.548054 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvqw\" (UniqueName: \"kubernetes.io/projected/44fc3e42-b872-4c53-99af-7d32682facb5-kube-api-access-cwvqw\") pod \"ingress-canary-6bftx\" (UID: \"44fc3e42-b872-4c53-99af-7d32682facb5\") " pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.550385 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.550364 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44fc3e42-b872-4c53-99af-7d32682facb5-cert\") pod \"ingress-canary-6bftx\" (UID: \"44fc3e42-b872-4c53-99af-7d32682facb5\") " pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.558732 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.558709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvqw\" (UniqueName: \"kubernetes.io/projected/44fc3e42-b872-4c53-99af-7d32682facb5-kube-api-access-cwvqw\") pod \"ingress-canary-6bftx\" (UID: \"44fc3e42-b872-4c53-99af-7d32682facb5\") " pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.578551 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.578516 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:39.639037 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.638630 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6bftx" Apr 17 17:25:39.698489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.698210 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5848fc8888-kv722"] Apr 17 17:25:39.702129 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:39.702098 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc8d11b_297a_4446_8745_9da775945615.slice/crio-801c6beb00984dd55df167dd40190ac557563b408a5452aebd002984ee4cdd86 WatchSource:0}: Error finding container 801c6beb00984dd55df167dd40190ac557563b408a5452aebd002984ee4cdd86: Status 404 returned error can't find the container with id 801c6beb00984dd55df167dd40190ac557563b408a5452aebd002984ee4cdd86 Apr 17 17:25:39.706003 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.705981 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sdn9m"] Apr 17 17:25:39.723963 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.723926 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-45bfq"] Apr 17 17:25:39.727805 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:39.727775 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod624a9915_ae5a_4744_8433_7bcca05df5bf.slice/crio-a38993692275d0fe02bf0b2d614e25ea49dbb9d4a2a97ca65799c467a6601a65 WatchSource:0}: Error finding container a38993692275d0fe02bf0b2d614e25ea49dbb9d4a2a97ca65799c467a6601a65: Status 404 returned error can't find the container with id a38993692275d0fe02bf0b2d614e25ea49dbb9d4a2a97ca65799c467a6601a65 Apr 17 17:25:39.770613 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:39.770591 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6bftx"] Apr 17 17:25:39.773554 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:39.773524 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44fc3e42_b872_4c53_99af_7d32682facb5.slice/crio-738f7013b768fc13b6794119848d2a2be5b47176999f61a30fa5662b8dfe67be WatchSource:0}: Error finding container 738f7013b768fc13b6794119848d2a2be5b47176999f61a30fa5662b8dfe67be: Status 404 returned error can't find the container with id 738f7013b768fc13b6794119848d2a2be5b47176999f61a30fa5662b8dfe67be Apr 17 17:25:40.044551 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.044467 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-45bfq" event={"ID":"624a9915-ae5a-4744-8433-7bcca05df5bf","Type":"ContainerStarted","Data":"a38993692275d0fe02bf0b2d614e25ea49dbb9d4a2a97ca65799c467a6601a65"} Apr 17 17:25:40.045684 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.045652 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6bftx" event={"ID":"44fc3e42-b872-4c53-99af-7d32682facb5","Type":"ContainerStarted","Data":"738f7013b768fc13b6794119848d2a2be5b47176999f61a30fa5662b8dfe67be"} Apr 17 17:25:40.047124 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.047101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5848fc8888-kv722" event={"ID":"3bc8d11b-297a-4446-8745-9da775945615","Type":"ContainerStarted","Data":"b8f862579f0654a65d61c768491eb6069d3fce31a4644a6eae7f5f7d1f1b8ff8"} Apr 17 
17:25:40.047124 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.047127 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5848fc8888-kv722" event={"ID":"3bc8d11b-297a-4446-8745-9da775945615","Type":"ContainerStarted","Data":"801c6beb00984dd55df167dd40190ac557563b408a5452aebd002984ee4cdd86"} Apr 17 17:25:40.047329 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.047295 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:25:40.048487 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.048469 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sdn9m" event={"ID":"808d341a-97c0-4c38-aece-d3877936bcb6","Type":"ContainerStarted","Data":"91446dd8c863621a4e1761e3de9ce001306cb8b5fd32479b3570bc38251e3f2c"} Apr 17 17:25:40.048487 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.048493 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sdn9m" event={"ID":"808d341a-97c0-4c38-aece-d3877936bcb6","Type":"ContainerStarted","Data":"3fee6c203b31ba6ea06174c21c81e2d31ffb3f4bd2b4d1e3cf603d12c549cbe8"} Apr 17 17:25:40.074039 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.073988 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5848fc8888-kv722" podStartSLOduration=2.073972622 podStartE2EDuration="2.073972622s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:40.073523747 +0000 UTC m=+51.743603391" watchObservedRunningTime="2026-04-17 17:25:40.073972622 +0000 UTC m=+51.744052264" Apr 17 17:25:40.833643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.833615 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:25:40.833824 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.833796 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:40.834486 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.833643 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:40.837910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.837623 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:25:40.837910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.837707 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vr6gd\"" Apr 17 17:25:40.837910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.837748 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bp9pv\"" Apr 17 17:25:40.837910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.837877 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:40.838149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.837953 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:40.838149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:40.838053 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:41.052564 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:41.052529 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sdn9m" event={"ID":"808d341a-97c0-4c38-aece-d3877936bcb6","Type":"ContainerStarted","Data":"cf9693037affbc5ccc3f96756731d27d8242ca722e05d8d15c2d6880bd652aca"} Apr 17 17:25:43.061422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:43.059961 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-45bfq" event={"ID":"624a9915-ae5a-4744-8433-7bcca05df5bf","Type":"ContainerStarted","Data":"295e1ab8a8cf74bb926ddf37feca1b523be7461d4cb8604d4dde5aad0e0c3a25"} Apr 17 17:25:43.061422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:43.060001 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-45bfq" event={"ID":"624a9915-ae5a-4744-8433-7bcca05df5bf","Type":"ContainerStarted","Data":"3f1ad0ddab63b16ee21d2310a44f8b85bced22c502335ba0d8e057ba77471917"} Apr 17 17:25:43.061422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:43.060812 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:43.062816 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:43.062791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6bftx" event={"ID":"44fc3e42-b872-4c53-99af-7d32682facb5","Type":"ContainerStarted","Data":"82e109b6478365e9b20646288df8c4120f29e375f7571f4b479bf20b51b6ab5c"} Apr 17 17:25:43.077816 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:43.077775 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-45bfq" podStartSLOduration=1.497291854 podStartE2EDuration="4.077761628s" podCreationTimestamp="2026-04-17 17:25:39 +0000 UTC" firstStartedPulling="2026-04-17 17:25:39.729651368 +0000 UTC m=+51.399730988" lastFinishedPulling="2026-04-17 17:25:42.310121139 +0000 UTC m=+53.980200762" observedRunningTime="2026-04-17 17:25:43.076918878 +0000 UTC m=+54.746998541" watchObservedRunningTime="2026-04-17 17:25:43.077761628 +0000 UTC 
m=+54.747841248" Apr 17 17:25:43.093221 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:43.093175 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6bftx" podStartSLOduration=1.553381308 podStartE2EDuration="4.093164394s" podCreationTimestamp="2026-04-17 17:25:39 +0000 UTC" firstStartedPulling="2026-04-17 17:25:39.775512981 +0000 UTC m=+51.445592620" lastFinishedPulling="2026-04-17 17:25:42.315296083 +0000 UTC m=+53.985375706" observedRunningTime="2026-04-17 17:25:43.092489142 +0000 UTC m=+54.762568788" watchObservedRunningTime="2026-04-17 17:25:43.093164394 +0000 UTC m=+54.763244014" Apr 17 17:25:44.069737 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:44.069703 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sdn9m" event={"ID":"808d341a-97c0-4c38-aece-d3877936bcb6","Type":"ContainerStarted","Data":"57585af796cb6cd2a02295259c9f6abba3e5ca7ab75b251e57bf0b6023bc2a79"} Apr 17 17:25:44.089046 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:44.089007 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sdn9m" podStartSLOduration=1.575093329 podStartE2EDuration="5.08899409s" podCreationTimestamp="2026-04-17 17:25:39 +0000 UTC" firstStartedPulling="2026-04-17 17:25:39.81562903 +0000 UTC m=+51.485708651" lastFinishedPulling="2026-04-17 17:25:43.329529783 +0000 UTC m=+54.999609412" observedRunningTime="2026-04-17 17:25:44.087763575 +0000 UTC m=+55.757843217" watchObservedRunningTime="2026-04-17 17:25:44.08899409 +0000 UTC m=+55.759073729" Apr 17 17:25:47.036824 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:47.036799 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz777" Apr 17 17:25:54.075435 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.075402 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-45bfq" Apr 17 17:25:54.671745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.671710 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:54.674107 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.674087 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:54.684056 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.684030 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a-metrics-certs\") pod \"network-metrics-daemon-wgmfp\" (UID: \"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a\") " pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:54.772510 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.772473 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:54.775003 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:25:54.774981 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:54.785425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.785409 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:54.795764 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.795744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvbnr\" (UniqueName: \"kubernetes.io/projected/d29a1468-8ac5-454a-a993-6a5055191ec4-kube-api-access-rvbnr\") pod \"network-check-target-vdhlz\" (UID: \"d29a1468-8ac5-454a-a993-6a5055191ec4\") " pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:54.959142 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.959066 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vr6gd\"" Apr 17 17:25:54.964936 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.964919 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bp9pv\"" Apr 17 17:25:54.967714 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.967700 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgmfp" Apr 17 17:25:54.973309 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:54.973291 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:55.101566 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:55.101539 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vdhlz"] Apr 17 17:25:55.104548 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:55.104520 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29a1468_8ac5_454a_a993_6a5055191ec4.slice/crio-ede757289d6c26aafdc55192cc7f413cca3711e2e5aa2961ed456d5eaad9ee83 WatchSource:0}: Error finding container ede757289d6c26aafdc55192cc7f413cca3711e2e5aa2961ed456d5eaad9ee83: Status 404 returned error can't find the container with id ede757289d6c26aafdc55192cc7f413cca3711e2e5aa2961ed456d5eaad9ee83 Apr 17 17:25:55.118239 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:55.118216 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wgmfp"] Apr 17 17:25:55.120612 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:55.120590 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1844a3f_8bb0_4d0d_92b3_6ec5fa1b443a.slice/crio-b6d6a1b947f95b1a62816ab6b149235a7193a18093c776beeedcc4d91f3e3822 WatchSource:0}: Error finding container b6d6a1b947f95b1a62816ab6b149235a7193a18093c776beeedcc4d91f3e3822: Status 404 returned error can't find the container with id b6d6a1b947f95b1a62816ab6b149235a7193a18093c776beeedcc4d91f3e3822 Apr 17 17:25:56.098269 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.098216 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wgmfp" 
event={"ID":"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a","Type":"ContainerStarted","Data":"b6d6a1b947f95b1a62816ab6b149235a7193a18093c776beeedcc4d91f3e3822"} Apr 17 17:25:56.099376 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.099347 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vdhlz" event={"ID":"d29a1468-8ac5-454a-a993-6a5055191ec4","Type":"ContainerStarted","Data":"ede757289d6c26aafdc55192cc7f413cca3711e2e5aa2961ed456d5eaad9ee83"} Apr 17 17:25:56.972609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.972580 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8882s"] Apr 17 17:25:56.975953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.975927 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:56.978810 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.978784 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:25:56.978927 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.978785 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:25:56.979512 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.979492 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-cqt9f\"" Apr 17 17:25:56.979633 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.979531 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 17:25:56.979633 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.979545 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:25:56.981343 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.981327 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 17:25:56.985785 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.985765 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2ss72"] Apr 17 17:25:56.989230 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.989213 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8882s"] Apr 17 17:25:56.989368 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.989354 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:56.991653 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.991582 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 17:25:56.991653 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.991631 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 17:25:56.991653 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.991652 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4dtkh\"" Apr 17 17:25:56.991879 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:56.991585 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 17:25:57.001681 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.001661 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kfhmc"] Apr 17 17:25:57.005124 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.005104 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2ss72"] Apr 17 17:25:57.005228 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.005220 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.007765 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.007571 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-29gws\"" Apr 17 17:25:57.007765 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.007611 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:25:57.007765 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.007638 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:25:57.007765 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.007649 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:25:57.089146 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089072 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-sys\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089146 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089115 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-textfile\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089145 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.089350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089176 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c08d72f-60f0-4355-8f32-3f4cb23cf290-metrics-client-ca\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089228 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.089477 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089353 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24cc7964-d06f-4bc3-a617-a62c30500317-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.089477 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.089477 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089418 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgnt\" (UniqueName: \"kubernetes.io/projected/8c08d72f-60f0-4355-8f32-3f4cb23cf290-kube-api-access-5sgnt\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089477 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089444 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-wtmp\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089648 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089474 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 
17:25:57.089648 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089508 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089648 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089563 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83bdc5ec-af0a-4592-ab21-2e80793d42c9-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.089648 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089618 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/24cc7964-d06f-4bc3-a617-a62c30500317-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.089837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089658 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-root\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089688 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8hw7\" (UniqueName: \"kubernetes.io/projected/24cc7964-d06f-4bc3-a617-a62c30500317-kube-api-access-r8hw7\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.089837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.089837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089743 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.089837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6kc\" (UniqueName: \"kubernetes.io/projected/83bdc5ec-af0a-4592-ab21-2e80793d42c9-kube-api-access-5j6kc\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.089837 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.089818 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-accelerators-collector-config\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.104209 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.104172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wgmfp" event={"ID":"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a","Type":"ContainerStarted","Data":"f09827d1899eee8e8cc08e8e8d3ad02eb3475e507fbc54bee67f83c56f9d6f77"} Apr 17 17:25:57.104209 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.104210 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wgmfp" event={"ID":"b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a","Type":"ContainerStarted","Data":"c6e62e0eae25f8392846d462dd3054a367ad17827e377ea7787553c07c7111c1"} Apr 17 17:25:57.122497 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.122447 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wgmfp" podStartSLOduration=66.672426937 podStartE2EDuration="1m8.122432285s" podCreationTimestamp="2026-04-17 17:24:49 +0000 UTC" firstStartedPulling="2026-04-17 17:25:55.122351753 +0000 UTC m=+66.792431374" lastFinishedPulling="2026-04-17 17:25:56.572357089 +0000 UTC m=+68.242436722" observedRunningTime="2026-04-17 17:25:57.121768715 +0000 UTC m=+68.791848358" watchObservedRunningTime="2026-04-17 17:25:57.122432285 +0000 UTC m=+68.792511926" Apr 17 17:25:57.190329 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190296 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6kc\" (UniqueName: \"kubernetes.io/projected/83bdc5ec-af0a-4592-ab21-2e80793d42c9-kube-api-access-5j6kc\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.190329 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-accelerators-collector-config\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-sys\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190384 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-textfile\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " 
pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190422 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190442 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-sys\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190456 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c08d72f-60f0-4355-8f32-3f4cb23cf290-metrics-client-ca\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.190544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190519 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24cc7964-d06f-4bc3-a617-a62c30500317-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190548 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgnt\" (UniqueName: \"kubernetes.io/projected/8c08d72f-60f0-4355-8f32-3f4cb23cf290-kube-api-access-5sgnt\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190599 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-wtmp\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 
17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190642 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190711 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-textfile\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.190893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83bdc5ec-af0a-4592-ab21-2e80793d42c9-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.191234 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.190912 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-wtmp\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.191234 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191020 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-accelerators-collector-config\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.191234 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.191117 2566 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 17:25:57.191234 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191149 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c08d72f-60f0-4355-8f32-3f4cb23cf290-metrics-client-ca\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.191234 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.191194 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls podName:83bdc5ec-af0a-4592-ab21-2e80793d42c9 nodeName:}" 
failed. No retries permitted until 2026-04-17 17:25:57.69117696 +0000 UTC m=+69.361256595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8882s" (UID: "83bdc5ec-af0a-4592-ab21-2e80793d42c9") : secret "openshift-state-metrics-tls" not found Apr 17 17:25:57.191529 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191489 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.191592 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/24cc7964-d06f-4bc3-a617-a62c30500317-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.191592 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191565 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-root\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.191717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191592 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8hw7\" (UniqueName: \"kubernetes.io/projected/24cc7964-d06f-4bc3-a617-a62c30500317-kube-api-access-r8hw7\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.191717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191625 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.191717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.191842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c08d72f-60f0-4355-8f32-3f4cb23cf290-root\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.191867 2566 secret.go:189] Couldn't get 
secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.191920 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls podName:8c08d72f-60f0-4355-8f32-3f4cb23cf290 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:57.691904839 +0000 UTC m=+69.361984461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls") pod "node-exporter-kfhmc" (UID: "8c08d72f-60f0-4355-8f32-3f4cb23cf290") : secret "node-exporter-tls" not found Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.191921 2566 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.191988 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-tls podName:24cc7964-d06f-4bc3-a617-a62c30500317 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:57.69196796 +0000 UTC m=+69.362047585 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-2ss72" (UID: "24cc7964-d06f-4bc3-a617-a62c30500317") : secret "kube-state-metrics-tls" not found Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.192224 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/24cc7964-d06f-4bc3-a617-a62c30500317-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.192280 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24cc7964-d06f-4bc3-a617-a62c30500317-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.192573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.192334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83bdc5ec-af0a-4592-ab21-2e80793d42c9-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.193887 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.193861 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.194006 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.193871 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.194349 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.194331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.200408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.200380 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgnt\" (UniqueName: \"kubernetes.io/projected/8c08d72f-60f0-4355-8f32-3f4cb23cf290-kube-api-access-5sgnt\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.201975 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.201949 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6kc\" (UniqueName: \"kubernetes.io/projected/83bdc5ec-af0a-4592-ab21-2e80793d42c9-kube-api-access-5j6kc\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.207269 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.207235 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8hw7\" (UniqueName: \"kubernetes.io/projected/24cc7964-d06f-4bc3-a617-a62c30500317-kube-api-access-r8hw7\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.697145 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.697111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:57.697367 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.697199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:57.697367 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.697225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.697367 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.697299 2566 secret.go:189] Couldn't get secret 
openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 17:25:57.697367 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.697331 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:57.697580 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.697386 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls podName:83bdc5ec-af0a-4592-ab21-2e80793d42c9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.697361976 +0000 UTC m=+70.367441609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8882s" (UID: "83bdc5ec-af0a-4592-ab21-2e80793d42c9") : secret "openshift-state-metrics-tls" not found Apr 17 17:25:57.697580 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:25:57.697407 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls podName:8c08d72f-60f0-4355-8f32-3f4cb23cf290 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.697396719 +0000 UTC m=+70.367476344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls") pod "node-exporter-kfhmc" (UID: "8c08d72f-60f0-4355-8f32-3f4cb23cf290") : secret "node-exporter-tls" not found Apr 17 17:25:57.699853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.699830 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/24cc7964-d06f-4bc3-a617-a62c30500317-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2ss72\" (UID: \"24cc7964-d06f-4bc3-a617-a62c30500317\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:57.899216 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:57.899181 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" Apr 17 17:25:58.086640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.086613 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:25:58.090183 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.090163 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.094996 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.094744 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-q4v94\"" Apr 17 17:25:58.095689 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095452 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 17:25:58.095689 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095497 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 17:25:58.095689 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095668 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 17:25:58.095843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095744 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 17:25:58.095892 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095880 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 17:25:58.095941 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095910 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 17:25:58.095986 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.095971 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 17:25:58.096412 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.096110 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 17:25:58.096412 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.096371 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 17:25:58.119422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.119398 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:25:58.201347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201312 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201454 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201356 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-config-out\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201454 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201539 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201539 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201633 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201616 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-web-config\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201688 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201671 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201733 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201717 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-config-volume\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201782 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201749 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67zq\" (UniqueName: \"kubernetes.io/projected/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-kube-api-access-x67zq\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201877 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.201931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201912 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.202076 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.201943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.226775 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.226753 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2ss72"] Apr 17 17:25:58.228788 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:58.228767 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24cc7964_d06f_4bc3_a617_a62c30500317.slice/crio-8539ae7ce5b90bb92971fcd5ec56ea00fa9ad2cde9e7e845a9e4026ec8815d4a WatchSource:0}: Error finding container 8539ae7ce5b90bb92971fcd5ec56ea00fa9ad2cde9e7e845a9e4026ec8815d4a: Status 404 returned error can't find the container with id 8539ae7ce5b90bb92971fcd5ec56ea00fa9ad2cde9e7e845a9e4026ec8815d4a Apr 17 17:25:58.302700 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302661 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-web-config\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.302700 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302714 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.302953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-config-volume\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.302953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302755 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.302953 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:25:58.302773 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x67zq\" (UniqueName: \"kubernetes.io/projected/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-kube-api-access-x67zq\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.302953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.302953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302939 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.302970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303027 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303053 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-config-out\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303080 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303106 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303139 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.303506 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303235 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.304016 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303945 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.304016 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.303950 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.306350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.306217 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-web-config\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.306350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.306305 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-config-volume\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.306522 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.306409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.306643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.306597 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.306643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.306633 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.307164 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:25:58.307143 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-config-out\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.307680 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.307659 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.307821 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.307806 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.307875 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.307808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.311451 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.311434 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67zq\" (UniqueName: \"kubernetes.io/projected/8bebf9bf-4deb-4828-8e6f-cf7ab5513932-kube-api-access-x67zq\") pod \"alertmanager-main-0\" (UID: \"8bebf9bf-4deb-4828-8e6f-cf7ab5513932\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.411700 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.411614 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:58.535246 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.535212 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:25:58.539460 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:58.539432 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bebf9bf_4deb_4828_8e6f_cf7ab5513932.slice/crio-9a4ba7e6f55314664ecff738a8a180515cc4f8c0876ff2771d00dfa49c39cf66 WatchSource:0}: Error finding container 9a4ba7e6f55314664ecff738a8a180515cc4f8c0876ff2771d00dfa49c39cf66: Status 404 returned error can't find the container with id 9a4ba7e6f55314664ecff738a8a180515cc4f8c0876ff2771d00dfa49c39cf66 Apr 17 17:25:58.706600 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.706513 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:58.706600 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.706569 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:58.708803 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.708778 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c08d72f-60f0-4355-8f32-3f4cb23cf290-node-exporter-tls\") pod \"node-exporter-kfhmc\" (UID: \"8c08d72f-60f0-4355-8f32-3f4cb23cf290\") " pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:58.708937 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.708919 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/83bdc5ec-af0a-4592-ab21-2e80793d42c9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8882s\" (UID: \"83bdc5ec-af0a-4592-ab21-2e80793d42c9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:58.786096 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.786060 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" Apr 17 17:25:58.814965 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.814931 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kfhmc" Apr 17 17:25:58.824863 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:58.824836 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c08d72f_60f0_4355_8f32_3f4cb23cf290.slice/crio-4623721a588d46580ad148e3a5478859639e08895cb8d52306ba0dc7db58bd4e WatchSource:0}: Error finding container 4623721a588d46580ad148e3a5478859639e08895cb8d52306ba0dc7db58bd4e: Status 404 returned error can't find the container with id 4623721a588d46580ad148e3a5478859639e08895cb8d52306ba0dc7db58bd4e Apr 17 17:25:58.946372 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:58.946345 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8882s"] Apr 17 17:25:58.950267 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:25:58.950222 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83bdc5ec_af0a_4592_ab21_2e80793d42c9.slice/crio-8c4d01b4e1feede011da2dac1745f96c30acb813ebe6aa98ab85f091e521c3a0 WatchSource:0}: Error finding container 8c4d01b4e1feede011da2dac1745f96c30acb813ebe6aa98ab85f091e521c3a0: Status 404 returned error can't find the container with id 8c4d01b4e1feede011da2dac1745f96c30acb813ebe6aa98ab85f091e521c3a0 Apr 17 17:25:59.113931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.113897 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vdhlz" event={"ID":"d29a1468-8ac5-454a-a993-6a5055191ec4","Type":"ContainerStarted","Data":"3885d15a226d5dbbc69d08843ab3738fe5a0b596d53097d19db2a9484061cafe"} Apr 17 17:25:59.114481 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.114461 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:25:59.116387 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.116317 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kfhmc" event={"ID":"8c08d72f-60f0-4355-8f32-3f4cb23cf290","Type":"ContainerStarted","Data":"4623721a588d46580ad148e3a5478859639e08895cb8d52306ba0dc7db58bd4e"} Apr 17 17:25:59.119462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.119389 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" event={"ID":"83bdc5ec-af0a-4592-ab21-2e80793d42c9","Type":"ContainerStarted","Data":"23540d461a0b80e54afa0549117cb198f39f413afbcfb9f73e943d2abbd51935"} Apr 17 17:25:59.119462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.119431 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" event={"ID":"83bdc5ec-af0a-4592-ab21-2e80793d42c9","Type":"ContainerStarted","Data":"26c7cb669627d793f72afb81f09ea4fff518fe4dc81010cd545ba06c433d8d21"} Apr 17 17:25:59.119462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.119446 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" event={"ID":"83bdc5ec-af0a-4592-ab21-2e80793d42c9","Type":"ContainerStarted","Data":"8c4d01b4e1feede011da2dac1745f96c30acb813ebe6aa98ab85f091e521c3a0"} Apr 17 17:25:59.120820 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.120775 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"9a4ba7e6f55314664ecff738a8a180515cc4f8c0876ff2771d00dfa49c39cf66"} Apr 17 17:25:59.122392 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.122356 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" event={"ID":"24cc7964-d06f-4bc3-a617-a62c30500317","Type":"ContainerStarted","Data":"8539ae7ce5b90bb92971fcd5ec56ea00fa9ad2cde9e7e845a9e4026ec8815d4a"} Apr 17 17:25:59.135650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.135603 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vdhlz" podStartSLOduration=68.088761892 podStartE2EDuration="1m11.135586538s" podCreationTimestamp="2026-04-17 17:24:48 +0000 UTC" firstStartedPulling="2026-04-17 17:25:55.106390745 +0000 UTC m=+66.776470387" lastFinishedPulling="2026-04-17 17:25:58.153215407 +0000 UTC m=+69.823295033" observedRunningTime="2026-04-17 17:25:59.134666953 +0000 UTC m=+70.804746596" watchObservedRunningTime="2026-04-17 17:25:59.135586538 +0000 UTC m=+70.805666183" Apr 17 17:25:59.534631 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.534597 2566 patch_prober.go:28] interesting pod/image-registry-5848fc8888-kv722 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:25:59.534771 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:25:59.534654 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5848fc8888-kv722" podUID="3bc8d11b-297a-4446-8745-9da775945615" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:26:00.129560 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:00.129503 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kfhmc" event={"ID":"8c08d72f-60f0-4355-8f32-3f4cb23cf290","Type":"ContainerStarted","Data":"828cb81a7bfe1b0e391f61b703bd8e0198a840e771c520a308dc1e7a0cf58325"} Apr 17 17:26:00.137866 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:00.131908 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bebf9bf-4deb-4828-8e6f-cf7ab5513932" containerID="b8e6bedc47b3a906a2808634d31a2b56b0fac83306c907ddcc671ecb6a740537" exitCode=0 Apr 17 17:26:00.137866 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:00.131976 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerDied","Data":"b8e6bedc47b3a906a2808634d31a2b56b0fac83306c907ddcc671ecb6a740537"} Apr 17 17:26:00.140715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:00.140576 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" event={"ID":"24cc7964-d06f-4bc3-a617-a62c30500317","Type":"ContainerStarted","Data":"4357676b3fb374868030a548d1600c37531a780e329b7ae7bbf8ecacf54766af"} Apr 17 17:26:01.057998 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.057967 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5848fc8888-kv722" Apr 17 17:26:01.145424 ip-10-0-140-147 kubenswrapper[2566]: 
I0417 17:26:01.145394 2566 generic.go:358] "Generic (PLEG): container finished" podID="8c08d72f-60f0-4355-8f32-3f4cb23cf290" containerID="828cb81a7bfe1b0e391f61b703bd8e0198a840e771c520a308dc1e7a0cf58325" exitCode=0 Apr 17 17:26:01.145836 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.145464 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kfhmc" event={"ID":"8c08d72f-60f0-4355-8f32-3f4cb23cf290","Type":"ContainerDied","Data":"828cb81a7bfe1b0e391f61b703bd8e0198a840e771c520a308dc1e7a0cf58325"} Apr 17 17:26:01.147744 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.147709 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" event={"ID":"83bdc5ec-af0a-4592-ab21-2e80793d42c9","Type":"ContainerStarted","Data":"be5f097b793882b2b7a685ccd8d269ca268c1e5b17b8e303087ff706a8ab69fc"} Apr 17 17:26:01.149802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.149778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" event={"ID":"24cc7964-d06f-4bc3-a617-a62c30500317","Type":"ContainerStarted","Data":"cc35735dc65e87ced5929dcb6277e96edf5294b31d594deb7ff72c9a7f934134"} Apr 17 17:26:01.149896 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.149809 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" event={"ID":"24cc7964-d06f-4bc3-a617-a62c30500317","Type":"ContainerStarted","Data":"befe7114e3112ae2b1f86dc74e87b1c9bdf536e314e8469ee37b8fa10e59e190"} Apr 17 17:26:01.188782 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.188725 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-2ss72" podStartSLOduration=3.47692248 podStartE2EDuration="5.188711677s" podCreationTimestamp="2026-04-17 17:25:56 +0000 UTC" firstStartedPulling="2026-04-17 17:25:58.230613963 +0000 UTC m=+69.900693583" lastFinishedPulling="2026-04-17 17:25:59.942403147 +0000 UTC m=+71.612482780" observedRunningTime="2026-04-17 17:26:01.187072753 +0000 UTC m=+72.857152444" watchObservedRunningTime="2026-04-17 17:26:01.188711677 +0000 UTC m=+72.858791319" Apr 17 17:26:01.220384 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:01.220329 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8882s" podStartSLOduration=3.811902421 podStartE2EDuration="5.220312181s" podCreationTimestamp="2026-04-17 17:25:56 +0000 UTC" firstStartedPulling="2026-04-17 17:25:59.097173893 +0000 UTC m=+70.767253519" lastFinishedPulling="2026-04-17 17:26:00.505583653 +0000 UTC m=+72.175663279" observedRunningTime="2026-04-17 17:26:01.220137986 +0000 UTC m=+72.890217628" watchObservedRunningTime="2026-04-17 17:26:01.220312181 +0000 UTC m=+72.890391820" Apr 17 17:26:02.153965 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.153932 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kfhmc" event={"ID":"8c08d72f-60f0-4355-8f32-3f4cb23cf290","Type":"ContainerStarted","Data":"0fb64b2c1df7dc23b1b054dbbfd74d5b494cf60e5a1b95e6e2e94bd387f1010a"} Apr 17 17:26:02.153965 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.153971 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kfhmc" 
event={"ID":"8c08d72f-60f0-4355-8f32-3f4cb23cf290","Type":"ContainerStarted","Data":"3533f9ef0cef41f8c432b49802587dd784a11158f39a6fa2a7a11f7840f58129"} Apr 17 17:26:02.156487 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.156461 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"074f5076fcfc18bb1b41c5ac4594b861249e53ff99dc0c58969b1b7e586b91b3"} Apr 17 17:26:02.156487 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.156490 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"16d123a14cae3f6f7fcc6a0eeea8abaa6b43db0ab0da5b72b94760c9d1e9a182"} Apr 17 17:26:02.156660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.156500 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"b4c3d4fe06441f9f9bdf0d8cca0239c148db8c01179ce016fcf1546c8aa6c73a"} Apr 17 17:26:02.156660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.156509 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"f35b6e0bfb56f81196ea8933067984503703bbe5c327d0f9f582e22dfa28ac1a"} Apr 17 17:26:02.156660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.156517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"534a679b2b84d0356881f876962e1a43a4d8a19f6b04b28a02dc1dc9542e5c91"} Apr 17 17:26:02.184305 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.184229 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kfhmc" podStartSLOduration=5.019822246 podStartE2EDuration="6.184213592s" podCreationTimestamp="2026-04-17 17:25:56 +0000 UTC" firstStartedPulling="2026-04-17 17:25:58.827078483 +0000 UTC m=+70.497158103" lastFinishedPulling="2026-04-17 17:25:59.991469818 +0000 UTC m=+71.661549449" observedRunningTime="2026-04-17 17:26:02.183547485 +0000 UTC m=+73.853627127" watchObservedRunningTime="2026-04-17 17:26:02.184213592 +0000 UTC m=+73.854293233" Apr 17 17:26:02.205573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.205530 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8b49859d5-x88sb"] Apr 17 17:26:02.210164 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.210139 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.212704 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.212681 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 17:26:02.212815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.212727 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 17:26:02.212815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.212782 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 17:26:02.212995 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.212975 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 17:26:02.213082 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.212991 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 17:26:02.213082 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.212991 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-k7m2w\"" Apr 17 17:26:02.219204 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.219187 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 17:26:02.222399 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.222376 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8b49859d5-x88sb"] Apr 17 17:26:02.338686 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.338648 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-serving-certs-ca-bundle\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.338855 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.338716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-federate-client-tls\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.338855 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.338743 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.338855 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.338787 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-telemeter-client-tls\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.339226 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.339206 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6rm\" (UniqueName: \"kubernetes.io/projected/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-kube-api-access-bn6rm\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.339324 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.339244 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-metrics-client-ca\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.339379 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.339335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.339454 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.339441 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-secret-telemeter-client\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440583 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6rm\" (UniqueName: \"kubernetes.io/projected/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-kube-api-access-bn6rm\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440583 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440542 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-metrics-client-ca\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440780 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440780 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440637 2566 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-secret-telemeter-client\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440780 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440715 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-serving-certs-ca-bundle\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440780 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440749 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-federate-client-tls\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440997 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.440997 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.440905 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-telemeter-client-tls\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.441645 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.441508 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-serving-certs-ca-bundle\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.441645 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.441558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-metrics-client-ca\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.442127 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.442095 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.443643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.443621 2566 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-telemeter-client-tls\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.443743 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.443659 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.443798 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.443778 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-secret-telemeter-client\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.444248 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.444223 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-federate-client-tls\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.449909 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.449887 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6rm\" (UniqueName: \"kubernetes.io/projected/bd58a1a6-2c41-40e0-adfd-e35fe24330cb-kube-api-access-bn6rm\") pod \"telemeter-client-8b49859d5-x88sb\" (UID: \"bd58a1a6-2c41-40e0-adfd-e35fe24330cb\") " pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.519490 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.519457 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" Apr 17 17:26:02.759231 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:02.759180 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8b49859d5-x88sb"] Apr 17 17:26:02.763222 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:26:02.763187 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd58a1a6_2c41_40e0_adfd_e35fe24330cb.slice/crio-752fbff3377e20f48fc35057dd4f725508f504d5aac045d5a47c3474d624ac59 WatchSource:0}: Error finding container 752fbff3377e20f48fc35057dd4f725508f504d5aac045d5a47c3474d624ac59: Status 404 returned error can't find the container with id 752fbff3377e20f48fc35057dd4f725508f504d5aac045d5a47c3474d624ac59 Apr 17 17:26:03.163490 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.163450 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bebf9bf-4deb-4828-8e6f-cf7ab5513932","Type":"ContainerStarted","Data":"24424f2ce19292cc861386f1435f186e3ff850fae9f8c4103a14a865ac2563c0"} Apr 17 17:26:03.164704 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.164679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" event={"ID":"bd58a1a6-2c41-40e0-adfd-e35fe24330cb","Type":"ContainerStarted","Data":"752fbff3377e20f48fc35057dd4f725508f504d5aac045d5a47c3474d624ac59"} Apr 17 17:26:03.205790 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.205739 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.075878646 podStartE2EDuration="5.205724052s" podCreationTimestamp="2026-04-17 17:25:58 +0000 UTC" firstStartedPulling="2026-04-17 17:25:58.541359037 +0000 UTC m=+70.211438658" lastFinishedPulling="2026-04-17 17:26:02.671204444 +0000 UTC m=+74.341284064" observedRunningTime="2026-04-17 17:26:03.201159707 +0000 UTC m=+74.871239348" watchObservedRunningTime="2026-04-17 17:26:03.205724052 +0000 UTC m=+74.875803691" Apr 17 17:26:03.267220 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.267190 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:03.270993 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.270974 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.273769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.273749 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:26:03.273866 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.273807 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:26:03.274223 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.274200 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:26:03.274347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.274224 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:26:03.274347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.274203 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:26:03.274347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.274201 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:26:03.274513 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.274400 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cjl48\"" Apr 17 17:26:03.274850 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.274835 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:26:03.275063 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.275045 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-c0s5pt2u8o78h\"" Apr 17 17:26:03.275063 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.275058 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:26:03.275194 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.275082 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:26:03.275194 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.275105 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:26:03.277245 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.277222 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:26:03.280105 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.279347 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:26:03.282988 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.282963 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:26:03.290457 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.290435 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:03.451488 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451418 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-config\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451488 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451488 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451652 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451674 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451811 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451692 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451811 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:26:03.451713 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90ed7337-b3e7-477c-967b-b847ef1e7833-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451811 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451743 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90ed7337-b3e7-477c-967b-b847ef1e7833-config-out\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451811 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451761 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451811 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451811 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451797 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h69f7\" (UniqueName: \"kubernetes.io/projected/90ed7337-b3e7-477c-967b-b847ef1e7833-kube-api-access-h69f7\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451853 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451905 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-web-config\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451922 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.451981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.451942 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.552917 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.552878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-config\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.552930 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.552969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553005 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553054 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553080 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553110 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553138 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553169 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90ed7337-b3e7-477c-967b-b847ef1e7833-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553193 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90ed7337-b3e7-477c-967b-b847ef1e7833-config-out\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553218 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553243 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553302 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h69f7\" (UniqueName: \"kubernetes.io/projected/90ed7337-b3e7-477c-967b-b847ef1e7833-kube-api-access-h69f7\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553345 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553370 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553401 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-web-config\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553450 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.553989 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.553940 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.554829 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.554092 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.554829 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.554717 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.555789 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.555555 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.555789 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.555636 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.557085 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.557056 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.557218 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.557196 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90ed7337-b3e7-477c-967b-b847ef1e7833-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.557493 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.557471 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-config\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.558232 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.558209 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.558391 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.558366 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.558486 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.558443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90ed7337-b3e7-477c-967b-b847ef1e7833-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.558805 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.558776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.558900 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.558860 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.559158 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.559140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.559206 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.559164 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.559270 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.559232 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90ed7337-b3e7-477c-967b-b847ef1e7833-web-config\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.559383 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.559231 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90ed7337-b3e7-477c-967b-b847ef1e7833-config-out\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.562672 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.562652 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h69f7\" (UniqueName: \"kubernetes.io/projected/90ed7337-b3e7-477c-967b-b847ef1e7833-kube-api-access-h69f7\") pod \"prometheus-k8s-0\" (UID: \"90ed7337-b3e7-477c-967b-b847ef1e7833\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.580614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.580567 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:03.715280 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:03.715234 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:03.716626 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:26:03.716601 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90ed7337_b3e7_477c_967b_b847ef1e7833.slice/crio-00723825255f2dbb5ec81e323c28831dca67bb34d6a27b56c6b11c5f05ddc108 WatchSource:0}: Error finding container 00723825255f2dbb5ec81e323c28831dca67bb34d6a27b56c6b11c5f05ddc108: Status 404 returned error can't find the container with id 00723825255f2dbb5ec81e323c28831dca67bb34d6a27b56c6b11c5f05ddc108 Apr 17 17:26:04.169135 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:04.169103 2566 generic.go:358] "Generic (PLEG): container finished" podID="90ed7337-b3e7-477c-967b-b847ef1e7833" containerID="c9e843d526ba7f95046f99cd5f57bca5b4270b962ba2d34988e8008a479e8017" exitCode=0 Apr 17 17:26:04.169559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:04.169186 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerDied","Data":"c9e843d526ba7f95046f99cd5f57bca5b4270b962ba2d34988e8008a479e8017"} Apr 17 17:26:04.169559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:04.169220 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"00723825255f2dbb5ec81e323c28831dca67bb34d6a27b56c6b11c5f05ddc108"} Apr 17 17:26:05.174161 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:05.174123 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" event={"ID":"bd58a1a6-2c41-40e0-adfd-e35fe24330cb","Type":"ContainerStarted","Data":"8669ca3e26dc77303ab9a2e019d8b2cb73f7f4345a816e8db3bcf7a109ec0fdc"} Apr 17 17:26:05.174561 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:26:05.174168 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" event={"ID":"bd58a1a6-2c41-40e0-adfd-e35fe24330cb","Type":"ContainerStarted","Data":"e6927d7b1f1c926f770a447b5a368ebb2d3a9346b7eacddb9487fccdd22bac5d"} Apr 17 17:26:05.174561 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:05.174184 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" event={"ID":"bd58a1a6-2c41-40e0-adfd-e35fe24330cb","Type":"ContainerStarted","Data":"b347e732dc1770c5ef609d6ed5b72aa6417c57da763fbe1da02eb6d7f94e1aee"} Apr 17 17:26:05.196288 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:05.196161 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8b49859d5-x88sb" podStartSLOduration=1.033216297 podStartE2EDuration="3.19614249s" podCreationTimestamp="2026-04-17 17:26:02 +0000 UTC" firstStartedPulling="2026-04-17 17:26:02.765045898 +0000 UTC m=+74.435125519" lastFinishedPulling="2026-04-17 17:26:04.927972088 +0000 UTC m=+76.598051712" observedRunningTime="2026-04-17 17:26:05.194874501 +0000 UTC m=+76.864954144" watchObservedRunningTime="2026-04-17 17:26:05.19614249 +0000 UTC m=+76.866222130" Apr 17 17:26:08.185028 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:08.184994 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"e084053f42ce624584b679abfb2291b163d92e6648a57832d9307444d88e6a3c"} Apr 17 17:26:08.185028 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:08.185033 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"5ba24b869b0dac7f167c2b48f2098f2a0dd1364dab2b4e8a9d87431398b1d862"} Apr 17 17:26:09.191740 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.191701 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"19365945109b9d5cd2359f12d7f00777089a46267efc150f9b8c2c0ab54909ae"} Apr 17 17:26:09.192074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.191749 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"8b3822dc02f7688dceb79a0caa6b722f2445ff3b90cd5fe8db66d3596d17f057"} Apr 17 17:26:09.192074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.191763 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"937c58645429ce343a0715c81301c81e67d01087f5bfa615484fd0232edf616c"} Apr 17 17:26:09.310829 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.310796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:26:09.313512 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.313492 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:26:09.323682 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.323662 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b2a4b11e-5add-4df7-8e69-5b3342e010fe-original-pull-secret\") pod \"global-pull-secret-syncer-q4gqn\" (UID: \"b2a4b11e-5add-4df7-8e69-5b3342e010fe\") " pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:26:09.348790 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.348768 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q4gqn" Apr 17 17:26:09.461956 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:09.461931 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q4gqn"] Apr 17 17:26:09.465027 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:26:09.465002 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a4b11e_5add_4df7_8e69_5b3342e010fe.slice/crio-2acd8ad454db83ed12bd2060cff6a8881c5f6f81abbf2a50bf6bbb917667912f WatchSource:0}: Error finding container 2acd8ad454db83ed12bd2060cff6a8881c5f6f81abbf2a50bf6bbb917667912f: Status 404 returned error can't find the container with id 2acd8ad454db83ed12bd2060cff6a8881c5f6f81abbf2a50bf6bbb917667912f Apr 17 17:26:10.199569 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:10.197080 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q4gqn" event={"ID":"b2a4b11e-5add-4df7-8e69-5b3342e010fe","Type":"ContainerStarted","Data":"2acd8ad454db83ed12bd2060cff6a8881c5f6f81abbf2a50bf6bbb917667912f"} Apr 17 17:26:10.202609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:10.202573 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90ed7337-b3e7-477c-967b-b847ef1e7833","Type":"ContainerStarted","Data":"d9ae1850dae47902392289c56a6a40b5c3b8abb1adb2280e573b1b54c413fad2"} Apr 17 17:26:10.231615 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:10.231534 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.405700527 podStartE2EDuration="7.231509158s" podCreationTimestamp="2026-04-17 17:26:03 +0000 UTC" firstStartedPulling="2026-04-17 17:26:04.170377733 +0000 UTC m=+75.840457354" lastFinishedPulling="2026-04-17 17:26:08.996186362 +0000 UTC m=+80.666265985" observedRunningTime="2026-04-17 17:26:10.230399308 +0000 UTC m=+81.900478953" watchObservedRunningTime="2026-04-17 17:26:10.231509158 +0000 UTC m=+81.901588812" Apr 17 17:26:13.581441 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:13.581405 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:14.217341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:14.217306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q4gqn" event={"ID":"b2a4b11e-5add-4df7-8e69-5b3342e010fe","Type":"ContainerStarted","Data":"ff02eb9c352fac068fb1fd34b04d6c233f85163bc5d608ea96ca2aab7c1a7e5a"} Apr 17 17:26:14.234916 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:14.234866 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q4gqn" podStartSLOduration=65.429278307 
podStartE2EDuration="1m9.23485152s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:26:09.466674198 +0000 UTC m=+81.136753819" lastFinishedPulling="2026-04-17 17:26:13.272247408 +0000 UTC m=+84.942327032" observedRunningTime="2026-04-17 17:26:14.234525216 +0000 UTC m=+85.904604857" watchObservedRunningTime="2026-04-17 17:26:14.23485152 +0000 UTC m=+85.904931164" Apr 17 17:26:30.143322 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:26:30.143287 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vdhlz" Apr 17 17:27:03.581519 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:27:03.581465 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:27:03.599955 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:27:03.599929 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:27:04.371133 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:27:04.371105 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:48.766471 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:29:48.766436 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:29:48.766963 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:29:48.766574 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:29:48.768840 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:29:48.768822 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:31:07.153238 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.153205 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-bgpkt"] Apr 17 17:31:07.156209 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.156193 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-bgpkt" Apr 17 17:31:07.159116 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.159095 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bv46s\"" Apr 17 17:31:07.159214 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.159097 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 17:31:07.159953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.159937 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:31:07.160021 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.159937 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:31:07.164047 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.164029 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-bgpkt"] Apr 17 17:31:07.222428 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.222402 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8n8s\" (UniqueName: \"kubernetes.io/projected/7bae9142-97fb-4c94-92a0-080475df4674-kube-api-access-z8n8s\") pod \"s3-init-bgpkt\" (UID: \"7bae9142-97fb-4c94-92a0-080475df4674\") " pod="kserve/s3-init-bgpkt" Apr 17 17:31:07.322959 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.322927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8n8s\" (UniqueName: \"kubernetes.io/projected/7bae9142-97fb-4c94-92a0-080475df4674-kube-api-access-z8n8s\") pod \"s3-init-bgpkt\" (UID: \"7bae9142-97fb-4c94-92a0-080475df4674\") " pod="kserve/s3-init-bgpkt" Apr 17 17:31:07.331189 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.331170 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8n8s\" (UniqueName: \"kubernetes.io/projected/7bae9142-97fb-4c94-92a0-080475df4674-kube-api-access-z8n8s\") pod \"s3-init-bgpkt\" (UID: \"7bae9142-97fb-4c94-92a0-080475df4674\") " pod="kserve/s3-init-bgpkt" Apr 17 17:31:07.478930 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.478840 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-bgpkt" Apr 17 17:31:07.596959 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.596938 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-bgpkt"] Apr 17 17:31:07.599547 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:31:07.599519 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bae9142_97fb_4c94_92a0_080475df4674.slice/crio-87f32a3017b158436a0bcffbd151cfd000922e35b4a727db0d56863d5025159c WatchSource:0}: Error finding container 87f32a3017b158436a0bcffbd151cfd000922e35b4a727db0d56863d5025159c: Status 404 returned error can't find the container with id 87f32a3017b158436a0bcffbd151cfd000922e35b4a727db0d56863d5025159c Apr 17 17:31:07.603325 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.603303 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:31:07.988694 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:07.988656 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-bgpkt" event={"ID":"7bae9142-97fb-4c94-92a0-080475df4674","Type":"ContainerStarted","Data":"87f32a3017b158436a0bcffbd151cfd000922e35b4a727db0d56863d5025159c"} Apr 17 17:31:12.002265 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:12.002230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-bgpkt" event={"ID":"7bae9142-97fb-4c94-92a0-080475df4674","Type":"ContainerStarted","Data":"b081e6e0211be44c587f15b5984e2dd64d0126c91a08c4dfc3ae8adcb36932ef"} Apr 17 17:31:15.011204 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:15.011174 2566 generic.go:358] "Generic (PLEG): container finished" podID="7bae9142-97fb-4c94-92a0-080475df4674" containerID="b081e6e0211be44c587f15b5984e2dd64d0126c91a08c4dfc3ae8adcb36932ef" exitCode=0 Apr 17 17:31:15.011566 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:15.011243 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-bgpkt" event={"ID":"7bae9142-97fb-4c94-92a0-080475df4674","Type":"ContainerDied","Data":"b081e6e0211be44c587f15b5984e2dd64d0126c91a08c4dfc3ae8adcb36932ef"} Apr 17 17:31:16.141378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:16.141355 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-bgpkt" Apr 17 17:31:16.206102 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:16.206071 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8n8s\" (UniqueName: \"kubernetes.io/projected/7bae9142-97fb-4c94-92a0-080475df4674-kube-api-access-z8n8s\") pod \"7bae9142-97fb-4c94-92a0-080475df4674\" (UID: \"7bae9142-97fb-4c94-92a0-080475df4674\") " Apr 17 17:31:16.208190 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:16.208168 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bae9142-97fb-4c94-92a0-080475df4674-kube-api-access-z8n8s" (OuterVolumeSpecName: "kube-api-access-z8n8s") pod "7bae9142-97fb-4c94-92a0-080475df4674" (UID: "7bae9142-97fb-4c94-92a0-080475df4674"). InnerVolumeSpecName "kube-api-access-z8n8s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:16.306847 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:16.306769 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8n8s\" (UniqueName: \"kubernetes.io/projected/7bae9142-97fb-4c94-92a0-080475df4674-kube-api-access-z8n8s\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:31:17.016810 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:17.016784 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-bgpkt" Apr 17 17:31:17.016810 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:17.016796 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-bgpkt" event={"ID":"7bae9142-97fb-4c94-92a0-080475df4674","Type":"ContainerDied","Data":"87f32a3017b158436a0bcffbd151cfd000922e35b4a727db0d56863d5025159c"} Apr 17 17:31:17.016987 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:17.016821 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f32a3017b158436a0bcffbd151cfd000922e35b4a727db0d56863d5025159c" Apr 17 17:31:50.646489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.646456 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-tgzvp"] Apr 17 17:31:50.646939 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.646730 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bae9142-97fb-4c94-92a0-080475df4674" containerName="s3-init" Apr 17 17:31:50.646939 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.646740 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bae9142-97fb-4c94-92a0-080475df4674" containerName="s3-init" Apr 17 17:31:50.646939 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.646800 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bae9142-97fb-4c94-92a0-080475df4674" containerName="s3-init" Apr 17 17:31:50.670884 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.670859 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-tgzvp"] Apr 17 17:31:50.671006 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.670951 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:50.673520 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.673499 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:31:50.673679 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.673559 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:31:50.674391 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.674371 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bv46s\"" Apr 17 17:31:50.674391 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.674382 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 17 17:31:50.780399 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.780371 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw8cn\" (UniqueName: \"kubernetes.io/projected/3d2d0aca-d91c-46e6-b056-c050639d3ba6-kube-api-access-dw8cn\") pod \"s3-tls-init-custom-tgzvp\" (UID: \"3d2d0aca-d91c-46e6-b056-c050639d3ba6\") " pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:50.880958 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.880924 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8cn\" (UniqueName: \"kubernetes.io/projected/3d2d0aca-d91c-46e6-b056-c050639d3ba6-kube-api-access-dw8cn\") pod \"s3-tls-init-custom-tgzvp\" (UID: \"3d2d0aca-d91c-46e6-b056-c050639d3ba6\") " pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:50.889299 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.889245 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8cn\" (UniqueName: \"kubernetes.io/projected/3d2d0aca-d91c-46e6-b056-c050639d3ba6-kube-api-access-dw8cn\") pod \"s3-tls-init-custom-tgzvp\" (UID: \"3d2d0aca-d91c-46e6-b056-c050639d3ba6\") " pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:50.991975 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:50.991907 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:51.109778 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:51.109659 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-tgzvp"] Apr 17 17:31:51.112613 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:31:51.112586 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d2d0aca_d91c_46e6_b056_c050639d3ba6.slice/crio-de735ed872da888e9af2c58d48335bc793f1dc753336dfd0b941c384e87d6b50 WatchSource:0}: Error finding container de735ed872da888e9af2c58d48335bc793f1dc753336dfd0b941c384e87d6b50: Status 404 returned error can't find the container with id de735ed872da888e9af2c58d48335bc793f1dc753336dfd0b941c384e87d6b50 Apr 17 17:31:52.116039 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:52.116002 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-tgzvp" event={"ID":"3d2d0aca-d91c-46e6-b056-c050639d3ba6","Type":"ContainerStarted","Data":"a1ee3fe9158fad40ba2a296ad4767574590d0a48eec641141eae8f4a4bfedccc"} Apr 17 17:31:52.116412 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:52.116048 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-tgzvp" event={"ID":"3d2d0aca-d91c-46e6-b056-c050639d3ba6","Type":"ContainerStarted","Data":"de735ed872da888e9af2c58d48335bc793f1dc753336dfd0b941c384e87d6b50"} Apr 17 17:31:52.134862 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:52.134807 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-tgzvp" podStartSLOduration=2.134788002 podStartE2EDuration="2.134788002s" podCreationTimestamp="2026-04-17 17:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:31:52.132410901 +0000 UTC m=+423.802490544" watchObservedRunningTime="2026-04-17 17:31:52.134788002 +0000 UTC m=+423.804867645" Apr 17 17:31:55.125346 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:55.125316 2566 generic.go:358] "Generic (PLEG): container finished" podID="3d2d0aca-d91c-46e6-b056-c050639d3ba6" containerID="a1ee3fe9158fad40ba2a296ad4767574590d0a48eec641141eae8f4a4bfedccc" exitCode=0 Apr 17 17:31:55.125715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:55.125383 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-tgzvp" event={"ID":"3d2d0aca-d91c-46e6-b056-c050639d3ba6","Type":"ContainerDied","Data":"a1ee3fe9158fad40ba2a296ad4767574590d0a48eec641141eae8f4a4bfedccc"} Apr 17 17:31:56.247985 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:56.247924 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:56.326285 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:56.326243 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw8cn\" (UniqueName: \"kubernetes.io/projected/3d2d0aca-d91c-46e6-b056-c050639d3ba6-kube-api-access-dw8cn\") pod \"3d2d0aca-d91c-46e6-b056-c050639d3ba6\" (UID: \"3d2d0aca-d91c-46e6-b056-c050639d3ba6\") " Apr 17 17:31:56.328123 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:56.328101 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2d0aca-d91c-46e6-b056-c050639d3ba6-kube-api-access-dw8cn" (OuterVolumeSpecName: "kube-api-access-dw8cn") pod "3d2d0aca-d91c-46e6-b056-c050639d3ba6" (UID: "3d2d0aca-d91c-46e6-b056-c050639d3ba6"). InnerVolumeSpecName "kube-api-access-dw8cn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:56.427807 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:56.427717 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dw8cn\" (UniqueName: \"kubernetes.io/projected/3d2d0aca-d91c-46e6-b056-c050639d3ba6-kube-api-access-dw8cn\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.132117 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:57.132044 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-tgzvp" Apr 17 17:31:57.132117 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:57.132052 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-tgzvp" event={"ID":"3d2d0aca-d91c-46e6-b056-c050639d3ba6","Type":"ContainerDied","Data":"de735ed872da888e9af2c58d48335bc793f1dc753336dfd0b941c384e87d6b50"} Apr 17 17:31:57.132117 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:57.132082 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de735ed872da888e9af2c58d48335bc793f1dc753336dfd0b941c384e87d6b50" Apr 17 17:31:59.933294 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.933238 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-s2mjg"] Apr 17 17:31:59.933763 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.933692 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d2d0aca-d91c-46e6-b056-c050639d3ba6" containerName="s3-tls-init-custom" Apr 17 17:31:59.933763 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.933709 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2d0aca-d91c-46e6-b056-c050639d3ba6" containerName="s3-tls-init-custom" Apr 17 17:31:59.933870 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.933774 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d2d0aca-d91c-46e6-b056-c050639d3ba6" containerName="s3-tls-init-custom" Apr 17 17:31:59.936656 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.936630 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:31:59.939816 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.939792 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 17 17:31:59.939953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.939852 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bv46s\"" Apr 17 17:31:59.939953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.939913 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:31:59.940423 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.940403 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:31:59.945736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:31:59.945714 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-s2mjg"] Apr 17 17:32:00.056909 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:00.056878 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqfw\" (UniqueName: \"kubernetes.io/projected/b45b0d0d-ce34-4f8e-b448-0d33f7523226-kube-api-access-pnqfw\") pod \"s3-tls-init-serving-s2mjg\" (UID: \"b45b0d0d-ce34-4f8e-b448-0d33f7523226\") " pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:32:00.157929 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:00.157897 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqfw\" (UniqueName: \"kubernetes.io/projected/b45b0d0d-ce34-4f8e-b448-0d33f7523226-kube-api-access-pnqfw\") pod \"s3-tls-init-serving-s2mjg\" (UID: \"b45b0d0d-ce34-4f8e-b448-0d33f7523226\") " pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:32:00.166271 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:00.166226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqfw\" (UniqueName: \"kubernetes.io/projected/b45b0d0d-ce34-4f8e-b448-0d33f7523226-kube-api-access-pnqfw\") pod \"s3-tls-init-serving-s2mjg\" (UID: \"b45b0d0d-ce34-4f8e-b448-0d33f7523226\") " pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:32:00.261136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:00.261065 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:32:00.372016 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:00.371986 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-s2mjg"] Apr 17 17:32:00.375577 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:32:00.375549 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb45b0d0d_ce34_4f8e_b448_0d33f7523226.slice/crio-04cb6a46f796079ecff25b7c3047183464ac68f7282df9beff45f7f232fa9f49 WatchSource:0}: Error finding container 04cb6a46f796079ecff25b7c3047183464ac68f7282df9beff45f7f232fa9f49: Status 404 returned error can't find the container with id 04cb6a46f796079ecff25b7c3047183464ac68f7282df9beff45f7f232fa9f49 Apr 17 17:32:01.142865 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:01.142832 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-s2mjg" event={"ID":"b45b0d0d-ce34-4f8e-b448-0d33f7523226","Type":"ContainerStarted","Data":"a63aeb1dbb99501fef9c97c23d1a9817a62c8f122d7e47a7be904964fdacfa2b"} Apr 17 17:32:01.142865 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:01.142865 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-s2mjg" event={"ID":"b45b0d0d-ce34-4f8e-b448-0d33f7523226","Type":"ContainerStarted","Data":"04cb6a46f796079ecff25b7c3047183464ac68f7282df9beff45f7f232fa9f49"} Apr 17 17:32:01.174559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:01.174498 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-s2mjg" podStartSLOduration=2.174480808 podStartE2EDuration="2.174480808s" podCreationTimestamp="2026-04-17 17:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:32:01.172838472 +0000 UTC m=+432.842918114" watchObservedRunningTime="2026-04-17 17:32:01.174480808 +0000 UTC m=+432.844560451" Apr 17 17:32:06.158674 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:06.158642 2566 generic.go:358] "Generic (PLEG): container finished" podID="b45b0d0d-ce34-4f8e-b448-0d33f7523226" containerID="a63aeb1dbb99501fef9c97c23d1a9817a62c8f122d7e47a7be904964fdacfa2b" exitCode=0 Apr 17 17:32:06.159054 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:06.158693 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-s2mjg" event={"ID":"b45b0d0d-ce34-4f8e-b448-0d33f7523226","Type":"ContainerDied","Data":"a63aeb1dbb99501fef9c97c23d1a9817a62c8f122d7e47a7be904964fdacfa2b"} Apr 17 17:32:07.285903 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:07.285882 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:32:07.418737 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:07.418661 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnqfw\" (UniqueName: \"kubernetes.io/projected/b45b0d0d-ce34-4f8e-b448-0d33f7523226-kube-api-access-pnqfw\") pod \"b45b0d0d-ce34-4f8e-b448-0d33f7523226\" (UID: \"b45b0d0d-ce34-4f8e-b448-0d33f7523226\") " Apr 17 17:32:07.420658 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:07.420633 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45b0d0d-ce34-4f8e-b448-0d33f7523226-kube-api-access-pnqfw" (OuterVolumeSpecName: "kube-api-access-pnqfw") pod "b45b0d0d-ce34-4f8e-b448-0d33f7523226" (UID: "b45b0d0d-ce34-4f8e-b448-0d33f7523226"). InnerVolumeSpecName "kube-api-access-pnqfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:32:07.519783 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:07.519753 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnqfw\" (UniqueName: \"kubernetes.io/projected/b45b0d0d-ce34-4f8e-b448-0d33f7523226-kube-api-access-pnqfw\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:32:08.165440 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:08.165403 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-s2mjg" event={"ID":"b45b0d0d-ce34-4f8e-b448-0d33f7523226","Type":"ContainerDied","Data":"04cb6a46f796079ecff25b7c3047183464ac68f7282df9beff45f7f232fa9f49"} Apr 17 17:32:08.165440 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:08.165437 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04cb6a46f796079ecff25b7c3047183464ac68f7282df9beff45f7f232fa9f49" Apr 17 17:32:08.165440 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:08.165418 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-s2mjg" Apr 17 17:32:17.945319 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.945286 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"] Apr 17 17:32:17.945669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.945572 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b45b0d0d-ce34-4f8e-b448-0d33f7523226" containerName="s3-tls-init-serving" Apr 17 17:32:17.945669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.945582 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45b0d0d-ce34-4f8e-b448-0d33f7523226" containerName="s3-tls-init-serving" Apr 17 17:32:17.945669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.945629 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b45b0d0d-ce34-4f8e-b448-0d33f7523226" containerName="s3-tls-init-serving" Apr 17 17:32:17.947802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.947785 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:17.950184 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.950156 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 17 17:32:17.951131 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.951102 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:32:17.951203 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.951110 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b25c4\"" Apr 17 17:32:17.951203 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.951110 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 17 17:32:17.951203 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.951168 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:32:17.973726 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:17.973706 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"] Apr 17 17:32:18.104246 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.104202 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3627da1a-71ab-4d46-990c-317de290cc90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.104246 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.104248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3627da1a-71ab-4d46-990c-317de290cc90-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.104463 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.104309 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjk9\" (UniqueName: \"kubernetes.io/projected/3627da1a-71ab-4d46-990c-317de290cc90-kube-api-access-7fjk9\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.104463 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.104385 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3627da1a-71ab-4d46-990c-317de290cc90-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.204780 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.204711 2566 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7fjk9\" (UniqueName: \"kubernetes.io/projected/3627da1a-71ab-4d46-990c-317de290cc90-kube-api-access-7fjk9\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.204780 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.204769 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3627da1a-71ab-4d46-990c-317de290cc90-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.204938 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.204801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3627da1a-71ab-4d46-990c-317de290cc90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.204938 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.204819 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3627da1a-71ab-4d46-990c-317de290cc90-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.205144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.205129 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3627da1a-71ab-4d46-990c-317de290cc90-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.205571 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.205553 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3627da1a-71ab-4d46-990c-317de290cc90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.207202 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.207186 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3627da1a-71ab-4d46-990c-317de290cc90-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.221754 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.221728 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjk9\" (UniqueName: \"kubernetes.io/projected/3627da1a-71ab-4d46-990c-317de290cc90-kube-api-access-7fjk9\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8\" (UID: 
\"3627da1a-71ab-4d46-990c-317de290cc90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.257769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.257746 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:18.388044 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:18.388020 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"] Apr 17 17:32:18.390109 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:32:18.390081 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3627da1a_71ab_4d46_990c_317de290cc90.slice/crio-f9c2dd4e645a53eb2139b195a7172d51c681689de443a7c77288802e4df0988a WatchSource:0}: Error finding container f9c2dd4e645a53eb2139b195a7172d51c681689de443a7c77288802e4df0988a: Status 404 returned error can't find the container with id f9c2dd4e645a53eb2139b195a7172d51c681689de443a7c77288802e4df0988a Apr 17 17:32:19.197914 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:19.197855 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerStarted","Data":"f9c2dd4e645a53eb2139b195a7172d51c681689de443a7c77288802e4df0988a"} Apr 17 17:32:22.208603 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:22.208559 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerStarted","Data":"2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14"} Apr 17 17:32:26.220985 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:26.220894 2566 generic.go:358] "Generic (PLEG): container finished" podID="3627da1a-71ab-4d46-990c-317de290cc90" containerID="2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14" exitCode=0 Apr 17 17:32:26.221355 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:26.220973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerDied","Data":"2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14"} Apr 17 17:32:39.275314 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:39.275276 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerStarted","Data":"2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26"} Apr 17 17:32:43.292614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:43.292577 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerStarted","Data":"d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1"} Apr 17 17:32:46.303920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:46.303881 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" 
event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerStarted","Data":"2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a"} Apr 17 17:32:46.304330 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:46.304055 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:46.343144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:46.343095 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podStartSLOduration=2.207709777 podStartE2EDuration="29.343080408s" podCreationTimestamp="2026-04-17 17:32:17 +0000 UTC" firstStartedPulling="2026-04-17 17:32:18.392014843 +0000 UTC m=+450.062094463" lastFinishedPulling="2026-04-17 17:32:45.527385473 +0000 UTC m=+477.197465094" observedRunningTime="2026-04-17 17:32:46.341944166 +0000 UTC m=+478.012023808" watchObservedRunningTime="2026-04-17 17:32:46.343080408 +0000 UTC m=+478.013160048" Apr 17 17:32:47.307503 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:47.307473 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:47.307503 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:47.307505 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:47.308748 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:47.308703 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:32:47.309506 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:47.309477 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:32:47.312201 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:47.312184 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:32:48.310018 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:48.309978 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:32:48.310403 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:48.310377 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:32:49.312607 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:49.312571 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: 
connect: connection refused" Apr 17 17:32:49.313020 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:49.312916 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:32:59.312905 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:59.312858 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:32:59.313418 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:32:59.313394 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:33:09.312591 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:09.312544 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:33:09.313046 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:09.313006 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:33:19.312966 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:19.312918 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:33:19.313491 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:19.313400 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:33:29.313424 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:29.313385 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:33:29.313879 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:29.313832 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:33:39.313234 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:39.313191 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:33:39.313697 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:39.313641 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:33:49.313268 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:49.313217 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:33:49.313641 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:33:49.313527 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:34:02.940401 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:02.940370 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"] Apr 17 17:34:02.940815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:02.940709 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" containerID="cri-o://2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a" gracePeriod=30 Apr 17 17:34:02.940815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:02.940707 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" containerID="cri-o://2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26" gracePeriod=30 Apr 17 17:34:02.940815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:02.940758 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" containerID="cri-o://d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1" gracePeriod=30 Apr 17 17:34:03.120061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.120022 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"] Apr 17 17:34:03.123485 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.123462 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.125807 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.125785 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 17 17:34:03.125913 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.125815 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 17 17:34:03.133966 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.133946 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"] Apr 17 17:34:03.273507 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.273419 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b058759-d1db-4428-a759-8c2d79f38eec-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.273507 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.273466 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drnj7\" (UniqueName: \"kubernetes.io/projected/4b058759-d1db-4428-a759-8c2d79f38eec-kube-api-access-drnj7\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.273690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.273551 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b058759-d1db-4428-a759-8c2d79f38eec-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.273690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.273603 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b058759-d1db-4428-a759-8c2d79f38eec-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.374866 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.374831 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b058759-d1db-4428-a759-8c2d79f38eec-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.375056 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.374890 2566 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b058759-d1db-4428-a759-8c2d79f38eec-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.375056 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.374951 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b058759-d1db-4428-a759-8c2d79f38eec-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.375056 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.374984 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drnj7\" (UniqueName: \"kubernetes.io/projected/4b058759-d1db-4428-a759-8c2d79f38eec-kube-api-access-drnj7\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.375362 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.375341 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b058759-d1db-4428-a759-8c2d79f38eec-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.375585 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.375567 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b058759-d1db-4428-a759-8c2d79f38eec-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.377192 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.377175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b058759-d1db-4428-a759-8c2d79f38eec-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.383636 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.383615 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnj7\" (UniqueName: \"kubernetes.io/projected/4b058759-d1db-4428-a759-8c2d79f38eec-kube-api-access-drnj7\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.433603 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.433563 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:03.532967 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.532891 2566 generic.go:358] "Generic (PLEG): container finished" podID="3627da1a-71ab-4d46-990c-317de290cc90" containerID="d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1" exitCode=2 Apr 17 17:34:03.533133 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.532959 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerDied","Data":"d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1"} Apr 17 17:34:03.550003 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:03.549973 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"] Apr 17 17:34:03.553815 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:34:03.553778 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b058759_d1db_4428_a759_8c2d79f38eec.slice/crio-7bedbc8b76407d58c0597f81c02dac4efa553f93d04828318132b2f5dbd69ff7 WatchSource:0}: Error finding container 7bedbc8b76407d58c0597f81c02dac4efa553f93d04828318132b2f5dbd69ff7: Status 404 returned error can't find the container with id 7bedbc8b76407d58c0597f81c02dac4efa553f93d04828318132b2f5dbd69ff7 Apr 17 17:34:04.538495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:04.538461 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerStarted","Data":"1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034"} Apr 17 17:34:04.538495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:04.538498 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerStarted","Data":"7bedbc8b76407d58c0597f81c02dac4efa553f93d04828318132b2f5dbd69ff7"} Apr 17 17:34:07.308061 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:07.308020 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused" Apr 17 17:34:07.549611 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:07.549537 2566 generic.go:358] "Generic (PLEG): container finished" podID="3627da1a-71ab-4d46-990c-317de290cc90" containerID="2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26" exitCode=0 Apr 17 17:34:07.549747 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:07.549611 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerDied","Data":"2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26"} Apr 17 17:34:07.550878 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:07.550858 2566 generic.go:358] "Generic (PLEG): container finished" podID="4b058759-d1db-4428-a759-8c2d79f38eec" 
containerID="1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034" exitCode=0 Apr 17 17:34:07.550931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:07.550896 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerDied","Data":"1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034"} Apr 17 17:34:08.555739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:08.555707 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerStarted","Data":"3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7"} Apr 17 17:34:08.555739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:08.555744 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerStarted","Data":"d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45"} Apr 17 17:34:08.556176 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:08.555754 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerStarted","Data":"5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c"} Apr 17 17:34:08.556176 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:08.555955 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:08.577318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:08.577277 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podStartSLOduration=5.577242246 podStartE2EDuration="5.577242246s" podCreationTimestamp="2026-04-17 17:34:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:34:08.575768595 +0000 UTC m=+560.245848237" watchObservedRunningTime="2026-04-17 17:34:08.577242246 +0000 UTC m=+560.247321888" Apr 17 17:34:09.313220 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:09.313173 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 17 17:34:09.313573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:09.313548 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:09.559552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:09.559517 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:34:09.559900 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:09.559561 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"
Apr 17 17:34:09.560875 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:09.560838 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused"
Apr 17 17:34:09.561536 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:09.561511 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:34:10.562559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:10.562522 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused"
Apr 17 17:34:10.563054 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:10.562800 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:34:12.308233 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:12.308193 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused"
Apr 17 17:34:15.566428 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:15.566402 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"
Apr 17 17:34:15.567040 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:15.567010 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused"
Apr 17 17:34:15.567304 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:15.567276 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:34:17.308233 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:17.308194 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused"
Apr 17 17:34:17.308651 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:17.308357 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"
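The repeated "Probe failed" entries above cover two kinds of readiness checks: a TCP connect against the kserve-container port (10.133.0.18:8080 and 10.133.0.19:5000, reported as "connect: connection refused") and an HTTP(S) check against the agent and kube-rbac-proxy (reported as "HTTP probe failed with statuscode: 503" or a refused connection on https://10.133.0.18:8643/healthz). The following is a minimal Go sketch, not part of the log, of how these two checks can be reproduced by hand; the addresses are copied from the probe output, and reachability of the pod IPs from wherever the sketch runs (e.g. a debug shell on the node) is an assumption the log does not establish.

// probecheck.go - hand-rolled equivalents of the failing readiness checks above.
package main

import (
	"crypto/tls"
	"fmt"
	"net"
	"net/http"
	"time"
)

func main() {
	// TCP-style readiness check: a refused connection matches the
	// "dial tcp ...: connect: connection refused" output in the log.
	if conn, err := net.DialTimeout("tcp", "10.133.0.19:5000", 2*time.Second); err != nil {
		fmt.Println("tcp probe failed:", err)
	} else {
		conn.Close()
		fmt.Println("tcp probe ok")
	}

	// HTTP-style readiness check: a status of 400 or higher counts as a
	// failure, matching "HTTP probe failed with statuscode: 503".
	client := &http.Client{
		Timeout: 2 * time.Second,
		// The serving cert is not issued for the bare pod IP, so certificate
		// verification is skipped for this manual check only.
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get("https://10.133.0.18:8643/healthz")
	if err != nil {
		fmt.Println("http probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("http probe status:", resp.StatusCode)
}

A refused TCP connection or an HTTP status of 400 or higher is what the kubelet records as a readiness failure, which is consistent with the outputs logged here while the predictor containers are still starting or shutting down.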
Apr 17 17:34:19.312705 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:19.312656 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused"
Apr 17 17:34:19.313079 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:19.312986 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:34:22.308644 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:22.308603 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused"
Apr 17 17:34:25.567161 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:25.567117 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused"
Apr 17 17:34:25.567632 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:25.567520 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:34:27.308078 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:27.308037 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused"
Apr 17 17:34:29.313531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:29.313484 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused"
Apr 17 17:34:29.313899 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:29.313625 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"
Apr 17 17:34:29.313899 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:29.313800 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:34:29.313968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:29.313897 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:34:32.308638 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:32.308598 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused" Apr 17 17:34:33.073690 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.073660 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:34:33.116593 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.116561 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fjk9\" (UniqueName: \"kubernetes.io/projected/3627da1a-71ab-4d46-990c-317de290cc90-kube-api-access-7fjk9\") pod \"3627da1a-71ab-4d46-990c-317de290cc90\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " Apr 17 17:34:33.116742 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.116623 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3627da1a-71ab-4d46-990c-317de290cc90-proxy-tls\") pod \"3627da1a-71ab-4d46-990c-317de290cc90\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " Apr 17 17:34:33.116742 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.116650 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3627da1a-71ab-4d46-990c-317de290cc90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"3627da1a-71ab-4d46-990c-317de290cc90\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " Apr 17 17:34:33.117038 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.117012 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3627da1a-71ab-4d46-990c-317de290cc90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "3627da1a-71ab-4d46-990c-317de290cc90" (UID: "3627da1a-71ab-4d46-990c-317de290cc90"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:33.118763 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.118737 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3627da1a-71ab-4d46-990c-317de290cc90-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3627da1a-71ab-4d46-990c-317de290cc90" (UID: "3627da1a-71ab-4d46-990c-317de290cc90"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:33.118839 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.118773 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3627da1a-71ab-4d46-990c-317de290cc90-kube-api-access-7fjk9" (OuterVolumeSpecName: "kube-api-access-7fjk9") pod "3627da1a-71ab-4d46-990c-317de290cc90" (UID: "3627da1a-71ab-4d46-990c-317de290cc90"). InnerVolumeSpecName "kube-api-access-7fjk9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:33.217486 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.217389 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3627da1a-71ab-4d46-990c-317de290cc90-kserve-provision-location\") pod \"3627da1a-71ab-4d46-990c-317de290cc90\" (UID: \"3627da1a-71ab-4d46-990c-317de290cc90\") " Apr 17 17:34:33.217642 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.217549 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7fjk9\" (UniqueName: \"kubernetes.io/projected/3627da1a-71ab-4d46-990c-317de290cc90-kube-api-access-7fjk9\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:34:33.217642 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.217566 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3627da1a-71ab-4d46-990c-317de290cc90-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:34:33.217642 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.217581 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3627da1a-71ab-4d46-990c-317de290cc90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:34:33.217739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.217683 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3627da1a-71ab-4d46-990c-317de290cc90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3627da1a-71ab-4d46-990c-317de290cc90" (UID: "3627da1a-71ab-4d46-990c-317de290cc90"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:33.318516 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.318485 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3627da1a-71ab-4d46-990c-317de290cc90-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:34:33.630394 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.630357 2566 generic.go:358] "Generic (PLEG): container finished" podID="3627da1a-71ab-4d46-990c-317de290cc90" containerID="2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a" exitCode=0 Apr 17 17:34:33.630554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.630415 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerDied","Data":"2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a"} Apr 17 17:34:33.630554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.630461 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" event={"ID":"3627da1a-71ab-4d46-990c-317de290cc90","Type":"ContainerDied","Data":"f9c2dd4e645a53eb2139b195a7172d51c681689de443a7c77288802e4df0988a"} Apr 17 17:34:33.630554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.630483 2566 scope.go:117] "RemoveContainer" containerID="2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a" Apr 17 17:34:33.630554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.630500 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8" Apr 17 17:34:33.639213 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.639193 2566 scope.go:117] "RemoveContainer" containerID="d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1" Apr 17 17:34:33.647967 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.647949 2566 scope.go:117] "RemoveContainer" containerID="2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26" Apr 17 17:34:33.653786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.653765 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"] Apr 17 17:34:33.655490 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.655477 2566 scope.go:117] "RemoveContainer" containerID="2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14" Apr 17 17:34:33.658484 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.658464 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-zvnq8"] Apr 17 17:34:33.661966 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.661948 2566 scope.go:117] "RemoveContainer" containerID="2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a" Apr 17 17:34:33.662195 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:34:33.662176 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a\": container with ID starting with 2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a not found: ID does not exist" containerID="2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a" Apr 17 17:34:33.662274 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:34:33.662202 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a"} err="failed to get container status \"2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a\": rpc error: code = NotFound desc = could not find container \"2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a\": container with ID starting with 2e826f7e336fa9b67f2387ce13604e47e73b0b60b7e9248b5e408c5ddcf8835a not found: ID does not exist" Apr 17 17:34:33.662274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.662233 2566 scope.go:117] "RemoveContainer" containerID="d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1" Apr 17 17:34:33.662482 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:34:33.662467 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1\": container with ID starting with d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1 not found: ID does not exist" containerID="d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1" Apr 17 17:34:33.662518 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.662488 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1"} err="failed to get container status \"d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1\": rpc error: code = NotFound desc = could not find container \"d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1\": container with ID starting with d57da273c202b3cf15abebbaec28f6a1cb7f3f34c3c69d87e4eeea7b2c2f57d1 not found: ID does not exist" Apr 17 17:34:33.662518 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.662501 2566 scope.go:117] "RemoveContainer" containerID="2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26" Apr 17 17:34:33.662689 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:34:33.662674 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26\": container with ID starting with 2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26 not found: ID does not exist" containerID="2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26" Apr 17 17:34:33.662728 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.662692 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26"} err="failed to get container status \"2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26\": rpc error: code = NotFound desc = could not find container \"2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26\": container with ID starting with 2b6d96cfc02d5cdd0dddb6c639307a709fe2322847fb6650f7a71b082867be26 not found: ID does not exist" Apr 17 17:34:33.662728 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.662703 2566 scope.go:117] "RemoveContainer" containerID="2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14" Apr 17 17:34:33.662934 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:34:33.662918 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14\": container with ID starting with 2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14 not found: ID does not exist" containerID="2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14" Apr 17 17:34:33.662981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:33.662940 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14"} err="failed to get container status \"2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14\": rpc error: code = NotFound desc = could not find container \"2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14\": container with ID starting with 2334288eec6824750109eba8d82bc95cce7e996e48dd214b30fdd6fc3fd94c14 not found: ID does not exist" Apr 17 17:34:34.839621 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:34.839580 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3627da1a-71ab-4d46-990c-317de290cc90" path="/var/lib/kubelet/pods/3627da1a-71ab-4d46-990c-317de290cc90/volumes" Apr 17 17:34:35.567571 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:35.567523 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 17:34:35.567957 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:35.567931 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:45.567621 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:45.567575 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 17:34:45.568128 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:45.568025 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:48.792115 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:48.792085 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:34:48.792485 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:48.792168 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:34:55.566967 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:55.566921 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 
17:34:55.567386 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:34:55.567285 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:05.571929 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:05.571882 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 17:35:05.572525 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:05.572494 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:15.568347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:15.568313 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:15.568721 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:15.568420 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:28.121218 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.121188 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"] Apr 17 17:35:28.123705 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.121537 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" containerID="cri-o://5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c" gracePeriod=30 Apr 17 17:35:28.123705 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.121576 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" containerID="cri-o://d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45" gracePeriod=30 Apr 17 17:35:28.123705 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.121572 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" containerID="cri-o://3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7" gracePeriod=30 Apr 17 17:35:28.191924 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.191893 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59"] Apr 17 17:35:28.192242 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192231 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192245 2566 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192268 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192275 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192287 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="storage-initializer" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192294 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="storage-initializer" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192308 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" Apr 17 17:35:28.192318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192314 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" Apr 17 17:35:28.192515 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192354 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kube-rbac-proxy" Apr 17 17:35:28.192515 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192365 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="agent" Apr 17 17:35:28.192515 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.192373 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3627da1a-71ab-4d46-990c-317de290cc90" containerName="kserve-container" Apr 17 17:35:28.195160 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.195141 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.197508 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.197489 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 17 17:35:28.197609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.197489 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 17 17:35:28.206628 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.206605 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59"] Apr 17 17:35:28.284635 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.284602 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.284801 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.284652 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.284801 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.284679 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6jxt\" (UniqueName: \"kubernetes.io/projected/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-kube-api-access-b6jxt\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.385195 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.385106 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.385195 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.385150 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.385195 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.385167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6jxt\" (UniqueName: \"kubernetes.io/projected/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-kube-api-access-b6jxt\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " 
pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.385816 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.385790 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.387558 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.387540 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.394053 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.394027 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6jxt\" (UniqueName: \"kubernetes.io/projected/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-kube-api-access-b6jxt\") pod \"message-dumper-predictor-c7d86bcbd-gmg59\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.504699 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.504664 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:28.622387 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.622364 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59"] Apr 17 17:35:28.624975 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:35:28.624951 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b7b11d5_7be8_4e9d_8c74_0172c6f19543.slice/crio-36ee9a2ba46dde129b01421e05d890c01713e0b1720e10ba5ced892600063249 WatchSource:0}: Error finding container 36ee9a2ba46dde129b01421e05d890c01713e0b1720e10ba5ced892600063249: Status 404 returned error can't find the container with id 36ee9a2ba46dde129b01421e05d890c01713e0b1720e10ba5ced892600063249 Apr 17 17:35:28.790353 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.790271 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" event={"ID":"2b7b11d5-7be8-4e9d-8c74-0172c6f19543","Type":"ContainerStarted","Data":"36ee9a2ba46dde129b01421e05d890c01713e0b1720e10ba5ced892600063249"} Apr 17 17:35:28.792106 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.792081 2566 generic.go:358] "Generic (PLEG): container finished" podID="4b058759-d1db-4428-a759-8c2d79f38eec" containerID="d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45" exitCode=2 Apr 17 17:35:28.792207 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:28.792137 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerDied","Data":"d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45"} Apr 17 17:35:29.796569 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:29.796531 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" event={"ID":"2b7b11d5-7be8-4e9d-8c74-0172c6f19543","Type":"ContainerStarted","Data":"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee"} Apr 17 17:35:29.796569 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:29.796571 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" event={"ID":"2b7b11d5-7be8-4e9d-8c74-0172c6f19543","Type":"ContainerStarted","Data":"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e"} Apr 17 17:35:29.797040 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:29.796667 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:29.816772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:29.816725 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" podStartSLOduration=0.854197236 podStartE2EDuration="1.816711298s" podCreationTimestamp="2026-04-17 17:35:28 +0000 UTC" firstStartedPulling="2026-04-17 17:35:28.626685549 +0000 UTC m=+640.296765168" lastFinishedPulling="2026-04-17 17:35:29.589199606 +0000 UTC m=+641.259279230" observedRunningTime="2026-04-17 17:35:29.814805452 +0000 UTC m=+641.484885093" watchObservedRunningTime="2026-04-17 17:35:29.816711298 +0000 UTC m=+641.486790939" Apr 17 17:35:30.563417 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:30.563377 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused" Apr 17 17:35:30.799245 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:30.799211 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:30.800842 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:30.800819 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:32.808136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:32.808104 2566 generic.go:358] "Generic (PLEG): container finished" podID="4b058759-d1db-4428-a759-8c2d79f38eec" containerID="5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c" exitCode=0 Apr 17 17:35:32.808529 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:32.808170 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerDied","Data":"5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c"} Apr 17 17:35:35.562682 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:35.562639 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused" Apr 17 17:35:35.566948 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:35.566918 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 17:35:35.567228 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:35.567206 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:37.812507 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:37.812478 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:35:38.266172 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.266093 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg"] Apr 17 17:35:38.270020 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.270003 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.272424 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.272402 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 17 17:35:38.272495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.272402 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 17 17:35:38.283325 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.283303 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg"] Apr 17 17:35:38.368069 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.368034 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfc1119-7d52-4bf8-b997-987ec7032fb5-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.368239 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.368074 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.368239 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.368179 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebfc1119-7d52-4bf8-b997-987ec7032fb5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.368239 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.368224 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdw8t\" (UniqueName: 
\"kubernetes.io/projected/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kube-api-access-fdw8t\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.469249 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.469218 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfc1119-7d52-4bf8-b997-987ec7032fb5-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.469422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.469286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.469422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.469342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebfc1119-7d52-4bf8-b997-987ec7032fb5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.469422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.469390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdw8t\" (UniqueName: \"kubernetes.io/projected/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kube-api-access-fdw8t\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.469702 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.469685 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.470085 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.470055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebfc1119-7d52-4bf8-b997-987ec7032fb5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.471856 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.471833 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfc1119-7d52-4bf8-b997-987ec7032fb5-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.478511 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.478490 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fdw8t\" (UniqueName: \"kubernetes.io/projected/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kube-api-access-fdw8t\") pod \"isvc-logger-predictor-64d54fcc88-wlplg\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.580278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.580221 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:38.702352 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.702282 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg"] Apr 17 17:35:38.704561 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:35:38.704533 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfc1119_7d52_4bf8_b997_987ec7032fb5.slice/crio-650488f0b812dcc53a7ee354e79775fb95d458c47d8a997c1c651f554c2baee6 WatchSource:0}: Error finding container 650488f0b812dcc53a7ee354e79775fb95d458c47d8a997c1c651f554c2baee6: Status 404 returned error can't find the container with id 650488f0b812dcc53a7ee354e79775fb95d458c47d8a997c1c651f554c2baee6 Apr 17 17:35:38.827230 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.827196 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerStarted","Data":"250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a"} Apr 17 17:35:38.827230 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:38.827229 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerStarted","Data":"650488f0b812dcc53a7ee354e79775fb95d458c47d8a997c1c651f554c2baee6"} Apr 17 17:35:40.563480 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:40.563439 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused" Apr 17 17:35:40.563859 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:40.563558 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:42.840443 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:42.840411 2566 generic.go:358] "Generic (PLEG): container finished" podID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerID="250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a" exitCode=0 Apr 17 17:35:42.840806 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:42.840476 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerDied","Data":"250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a"} Apr 17 17:35:43.845548 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.845509 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" 
event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerStarted","Data":"55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78"} Apr 17 17:35:43.845548 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.845553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerStarted","Data":"e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8"} Apr 17 17:35:43.845943 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.845562 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerStarted","Data":"fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91"} Apr 17 17:35:43.845943 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.845897 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:43.845943 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.845920 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:43.847335 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.847309 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:35:43.868863 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:43.868820 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podStartSLOduration=5.868808521 podStartE2EDuration="5.868808521s" podCreationTimestamp="2026-04-17 17:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:43.867177454 +0000 UTC m=+655.537257120" watchObservedRunningTime="2026-04-17 17:35:43.868808521 +0000 UTC m=+655.538888163" Apr 17 17:35:44.848527 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:44.848489 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:44.848964 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:44.848638 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:35:44.849606 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:44.849579 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:45.563722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:45.563680 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused" Apr 17 17:35:45.566962 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:45.566929 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 17:35:45.567322 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:45.567300 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:45.851903 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:45.851804 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:35:45.852316 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:45.852172 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:50.563461 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:50.563419 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused" Apr 17 17:35:50.856461 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:50.856378 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:35:50.856952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:50.856927 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:35:50.857242 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:50.857204 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:55.563621 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:55.563579 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused" Apr 17 17:35:55.567098 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:55.567060 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" 
podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:5000: connect: connection refused" Apr 17 17:35:55.567219 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:55.567196 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:55.567308 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:55.567209 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:55.567358 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:55.567332 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:58.264363 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.264338 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:58.345403 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.345370 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b058759-d1db-4428-a759-8c2d79f38eec-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"4b058759-d1db-4428-a759-8c2d79f38eec\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " Apr 17 17:35:58.345597 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.345414 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b058759-d1db-4428-a759-8c2d79f38eec-kserve-provision-location\") pod \"4b058759-d1db-4428-a759-8c2d79f38eec\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " Apr 17 17:35:58.345597 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.345440 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drnj7\" (UniqueName: \"kubernetes.io/projected/4b058759-d1db-4428-a759-8c2d79f38eec-kube-api-access-drnj7\") pod \"4b058759-d1db-4428-a759-8c2d79f38eec\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " Apr 17 17:35:58.345597 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.345504 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b058759-d1db-4428-a759-8c2d79f38eec-proxy-tls\") pod \"4b058759-d1db-4428-a759-8c2d79f38eec\" (UID: \"4b058759-d1db-4428-a759-8c2d79f38eec\") " Apr 17 17:35:58.345758 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.345728 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b058759-d1db-4428-a759-8c2d79f38eec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4b058759-d1db-4428-a759-8c2d79f38eec" (UID: "4b058759-d1db-4428-a759-8c2d79f38eec"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:35:58.345806 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.345781 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b058759-d1db-4428-a759-8c2d79f38eec-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "4b058759-d1db-4428-a759-8c2d79f38eec" (UID: "4b058759-d1db-4428-a759-8c2d79f38eec"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:35:58.347656 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.347631 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b058759-d1db-4428-a759-8c2d79f38eec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4b058759-d1db-4428-a759-8c2d79f38eec" (UID: "4b058759-d1db-4428-a759-8c2d79f38eec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:35:58.347656 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.347632 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b058759-d1db-4428-a759-8c2d79f38eec-kube-api-access-drnj7" (OuterVolumeSpecName: "kube-api-access-drnj7") pod "4b058759-d1db-4428-a759-8c2d79f38eec" (UID: "4b058759-d1db-4428-a759-8c2d79f38eec"). InnerVolumeSpecName "kube-api-access-drnj7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:35:58.446309 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.446206 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b058759-d1db-4428-a759-8c2d79f38eec-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:35:58.446309 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.446239 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b058759-d1db-4428-a759-8c2d79f38eec-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:35:58.446309 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.446272 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-drnj7\" (UniqueName: \"kubernetes.io/projected/4b058759-d1db-4428-a759-8c2d79f38eec-kube-api-access-drnj7\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:35:58.446309 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.446284 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b058759-d1db-4428-a759-8c2d79f38eec-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:35:58.893327 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.893297 2566 generic.go:358] "Generic (PLEG): container finished" podID="4b058759-d1db-4428-a759-8c2d79f38eec" containerID="3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7" exitCode=0 Apr 17 17:35:58.893501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.893384 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" Apr 17 17:35:58.893501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.893378 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerDied","Data":"3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7"} Apr 17 17:35:58.893615 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.893500 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2" event={"ID":"4b058759-d1db-4428-a759-8c2d79f38eec","Type":"ContainerDied","Data":"7bedbc8b76407d58c0597f81c02dac4efa553f93d04828318132b2f5dbd69ff7"} Apr 17 17:35:58.893615 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.893525 2566 scope.go:117] "RemoveContainer" containerID="3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7" Apr 17 17:35:58.901088 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.901072 2566 scope.go:117] "RemoveContainer" containerID="d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45" Apr 17 17:35:58.907899 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.907881 2566 scope.go:117] "RemoveContainer" containerID="5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c" Apr 17 17:35:58.911118 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.911096 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"] Apr 17 17:35:58.915001 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.914987 2566 scope.go:117] "RemoveContainer" containerID="1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034" Apr 17 17:35:58.921927 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.921914 2566 scope.go:117] "RemoveContainer" containerID="3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7" Apr 17 17:35:58.922195 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:35:58.922173 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7\": container with ID starting with 3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7 not found: ID does not exist" containerID="3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7" Apr 17 17:35:58.922304 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922201 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7"} err="failed to get container status \"3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7\": rpc error: code = NotFound desc = could not find container \"3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7\": container with ID starting with 3251ef30662526e0385c70e2b52cef1956634e345c777540f2d46bddf59400c7 not found: ID does not exist" Apr 17 17:35:58.922304 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922221 2566 scope.go:117] "RemoveContainer" containerID="d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45" Apr 17 17:35:58.922433 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922369 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-pwpk2"] 
Apr 17 17:35:58.922611 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:35:58.922594 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45\": container with ID starting with d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45 not found: ID does not exist" containerID="d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45" Apr 17 17:35:58.922670 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922620 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45"} err="failed to get container status \"d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45\": rpc error: code = NotFound desc = could not find container \"d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45\": container with ID starting with d78ba668e68adfaaae79e2ca797dd078ef99b132e355dd29fc31c1d2d29e1b45 not found: ID does not exist" Apr 17 17:35:58.922670 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922644 2566 scope.go:117] "RemoveContainer" containerID="5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c" Apr 17 17:35:58.922850 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:35:58.922833 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c\": container with ID starting with 5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c not found: ID does not exist" containerID="5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c" Apr 17 17:35:58.922891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922855 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c"} err="failed to get container status \"5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c\": rpc error: code = NotFound desc = could not find container \"5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c\": container with ID starting with 5823aafa44d0f036e06126d593f998f0982fe4590eaaa13b4286a6906a2eaf0c not found: ID does not exist" Apr 17 17:35:58.922891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.922870 2566 scope.go:117] "RemoveContainer" containerID="1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034" Apr 17 17:35:58.923094 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:35:58.923075 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034\": container with ID starting with 1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034 not found: ID does not exist" containerID="1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034" Apr 17 17:35:58.923132 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:35:58.923101 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034"} err="failed to get container status \"1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034\": rpc error: code = NotFound desc = could not find container 
\"1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034\": container with ID starting with 1e293ff56c5635c17e83759304e067ee1173cf048c30dd7be95ac0da48a0e034 not found: ID does not exist" Apr 17 17:36:00.838536 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:00.838508 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" path="/var/lib/kubelet/pods/4b058759-d1db-4428-a759-8c2d79f38eec/volumes" Apr 17 17:36:00.857595 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:00.857558 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:36:00.858048 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:00.858023 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:10.857010 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:10.856966 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:36:10.857492 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:10.857467 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:20.856916 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:20.856869 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:36:20.857333 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:20.857246 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:30.856814 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:30.856772 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:36:30.857310 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:30.857225 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:40.857737 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:40.857694 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:36:40.858220 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:40.858186 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:50.857437 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:50.857410 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:36:50.857901 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:36:50.857882 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:37:03.273413 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.273383 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-gmg59_2b7b11d5-7be8-4e9d-8c74-0172c6f19543/kserve-container/0.log" Apr 17 17:37:03.466657 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.466627 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg"] Apr 17 17:37:03.466992 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.466947 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" containerID="cri-o://fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91" gracePeriod=30 Apr 17 17:37:03.467136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.466991 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" containerID="cri-o://e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8" gracePeriod=30 Apr 17 17:37:03.467136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.466983 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" containerID="cri-o://55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78" gracePeriod=30 Apr 17 17:37:03.510830 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.510801 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw"] Apr 17 17:37:03.511130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511118 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="storage-initializer" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511131 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="storage-initializer" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511140 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511146 2566 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511154 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511159 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511175 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" Apr 17 17:37:03.511179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511180 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" Apr 17 17:37:03.511415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511224 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kserve-container" Apr 17 17:37:03.511415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511237 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="kube-rbac-proxy" Apr 17 17:37:03.511415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.511243 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b058759-d1db-4428-a759-8c2d79f38eec" containerName="agent" Apr 17 17:37:03.514227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.514210 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.516679 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.516662 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 17 17:37:03.516755 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.516666 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 17 17:37:03.525374 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.525321 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw"] Apr 17 17:37:03.568035 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.568006 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59"] Apr 17 17:37:03.568332 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.568273 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kserve-container" containerID="cri-o://56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e" gracePeriod=30 Apr 17 17:37:03.568332 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.568299 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kube-rbac-proxy" containerID="cri-o://670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee" gracePeriod=30 Apr 17 17:37:03.597201 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.597172 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6pj5\" (UniqueName: \"kubernetes.io/projected/bd72de05-6f18-45e5-861d-a35b6692e91a-kube-api-access-p6pj5\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.597341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.597216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd72de05-6f18-45e5-861d-a35b6692e91a-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.597341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.597276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd72de05-6f18-45e5-861d-a35b6692e91a-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.597341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.597329 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd72de05-6f18-45e5-861d-a35b6692e91a-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: 
\"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.698062 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.698030 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd72de05-6f18-45e5-861d-a35b6692e91a-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.698211 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.698193 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6pj5\" (UniqueName: \"kubernetes.io/projected/bd72de05-6f18-45e5-861d-a35b6692e91a-kube-api-access-p6pj5\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.698292 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.698247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd72de05-6f18-45e5-861d-a35b6692e91a-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.698352 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.698296 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd72de05-6f18-45e5-861d-a35b6692e91a-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.698583 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.698565 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd72de05-6f18-45e5-861d-a35b6692e91a-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.698860 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.698832 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd72de05-6f18-45e5-861d-a35b6692e91a-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.700609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.700584 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd72de05-6f18-45e5-861d-a35b6692e91a-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.706999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.706974 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6pj5\" (UniqueName: 
\"kubernetes.io/projected/bd72de05-6f18-45e5-861d-a35b6692e91a-kube-api-access-p6pj5\") pod \"isvc-lightgbm-predictor-bdf964bd-9vmjw\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.797491 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.797466 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:37:03.824405 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.824377 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:03.899756 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.899711 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-proxy-tls\") pod \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " Apr 17 17:37:03.899891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.899768 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-message-dumper-kube-rbac-proxy-sar-config\") pod \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " Apr 17 17:37:03.899891 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.899882 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6jxt\" (UniqueName: \"kubernetes.io/projected/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-kube-api-access-b6jxt\") pod \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\" (UID: \"2b7b11d5-7be8-4e9d-8c74-0172c6f19543\") " Apr 17 17:37:03.900190 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.900156 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "2b7b11d5-7be8-4e9d-8c74-0172c6f19543" (UID: "2b7b11d5-7be8-4e9d-8c74-0172c6f19543"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:37:03.904352 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.904314 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-kube-api-access-b6jxt" (OuterVolumeSpecName: "kube-api-access-b6jxt") pod "2b7b11d5-7be8-4e9d-8c74-0172c6f19543" (UID: "2b7b11d5-7be8-4e9d-8c74-0172c6f19543"). InnerVolumeSpecName "kube-api-access-b6jxt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:37:03.904462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.904366 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2b7b11d5-7be8-4e9d-8c74-0172c6f19543" (UID: "2b7b11d5-7be8-4e9d-8c74-0172c6f19543"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:37:03.942597 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.942453 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw"] Apr 17 17:37:03.945220 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:37:03.945194 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd72de05_6f18_45e5_861d_a35b6692e91a.slice/crio-019133a6d27b038e462d2fdfcf71c293fd8c936a77c3bde8bded2c96a2842c0f WatchSource:0}: Error finding container 019133a6d27b038e462d2fdfcf71c293fd8c936a77c3bde8bded2c96a2842c0f: Status 404 returned error can't find the container with id 019133a6d27b038e462d2fdfcf71c293fd8c936a77c3bde8bded2c96a2842c0f Apr 17 17:37:03.946909 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:03.946895 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:37:04.000776 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.000753 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6jxt\" (UniqueName: \"kubernetes.io/projected/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-kube-api-access-b6jxt\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:04.000776 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.000775 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:04.000917 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.000785 2566 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b7b11d5-7be8-4e9d-8c74-0172c6f19543-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:04.079169 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079139 2566 generic.go:358] "Generic (PLEG): container finished" podID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerID="670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee" exitCode=2 Apr 17 17:37:04.079169 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079162 2566 generic.go:358] "Generic (PLEG): container finished" podID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerID="56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e" exitCode=2 Apr 17 17:37:04.079430 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079209 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" Apr 17 17:37:04.079430 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079221 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" event={"ID":"2b7b11d5-7be8-4e9d-8c74-0172c6f19543","Type":"ContainerDied","Data":"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee"} Apr 17 17:37:04.079430 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079294 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" event={"ID":"2b7b11d5-7be8-4e9d-8c74-0172c6f19543","Type":"ContainerDied","Data":"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e"} Apr 17 17:37:04.079430 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079311 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59" event={"ID":"2b7b11d5-7be8-4e9d-8c74-0172c6f19543","Type":"ContainerDied","Data":"36ee9a2ba46dde129b01421e05d890c01713e0b1720e10ba5ced892600063249"} Apr 17 17:37:04.079430 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.079330 2566 scope.go:117] "RemoveContainer" containerID="670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee" Apr 17 17:37:04.081751 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.081731 2566 generic.go:358] "Generic (PLEG): container finished" podID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerID="e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8" exitCode=2 Apr 17 17:37:04.081880 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.081796 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerDied","Data":"e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8"} Apr 17 17:37:04.083054 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.083035 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerStarted","Data":"1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9"} Apr 17 17:37:04.083151 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.083062 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerStarted","Data":"019133a6d27b038e462d2fdfcf71c293fd8c936a77c3bde8bded2c96a2842c0f"} Apr 17 17:37:04.089174 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.089158 2566 scope.go:117] "RemoveContainer" containerID="56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e" Apr 17 17:37:04.096298 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.096235 2566 scope.go:117] "RemoveContainer" containerID="670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee" Apr 17 17:37:04.096569 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:37:04.096548 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee\": container with ID starting with 670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee not found: ID does not exist" 
containerID="670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee" Apr 17 17:37:04.096658 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.096574 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee"} err="failed to get container status \"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee\": rpc error: code = NotFound desc = could not find container \"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee\": container with ID starting with 670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee not found: ID does not exist" Apr 17 17:37:04.096658 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.096592 2566 scope.go:117] "RemoveContainer" containerID="56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e" Apr 17 17:37:04.096833 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:37:04.096817 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e\": container with ID starting with 56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e not found: ID does not exist" containerID="56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e" Apr 17 17:37:04.096872 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.096837 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e"} err="failed to get container status \"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e\": rpc error: code = NotFound desc = could not find container \"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e\": container with ID starting with 56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e not found: ID does not exist" Apr 17 17:37:04.096872 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.096850 2566 scope.go:117] "RemoveContainer" containerID="670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee" Apr 17 17:37:04.097088 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.097065 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee"} err="failed to get container status \"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee\": rpc error: code = NotFound desc = could not find container \"670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee\": container with ID starting with 670d5e6ec6f535d67522ddbb6461faacc3ab17a945e38f4e3bc694c0878738ee not found: ID does not exist" Apr 17 17:37:04.097149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.097089 2566 scope.go:117] "RemoveContainer" containerID="56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e" Apr 17 17:37:04.097313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.097296 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e"} err="failed to get container status \"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e\": rpc error: code = NotFound desc = could not find container \"56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e\": container with ID starting with 
56984335d1bf159541879c99abf8ff4962c3858ef0d99c4636cec5c5521eb87e not found: ID does not exist" Apr 17 17:37:04.133731 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.133700 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59"] Apr 17 17:37:04.141520 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.141492 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-gmg59"] Apr 17 17:37:04.839170 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:04.839136 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" path="/var/lib/kubelet/pods/2b7b11d5-7be8-4e9d-8c74-0172c6f19543/volumes" Apr 17 17:37:05.852737 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:05.852698 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 17 17:37:08.098915 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:08.098879 2566 generic.go:358] "Generic (PLEG): container finished" podID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerID="fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91" exitCode=0 Apr 17 17:37:08.099332 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:08.098947 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerDied","Data":"fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91"} Apr 17 17:37:08.100170 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:08.100146 2566 generic.go:358] "Generic (PLEG): container finished" podID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerID="1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9" exitCode=0 Apr 17 17:37:08.100326 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:08.100189 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerDied","Data":"1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9"} Apr 17 17:37:10.852345 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:10.852277 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 17 17:37:10.857745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:10.857719 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:37:10.858473 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:10.858443 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 
17 17:37:15.125989 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.125951 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerStarted","Data":"f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716"} Apr 17 17:37:15.125989 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.125993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerStarted","Data":"ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5"} Apr 17 17:37:15.126410 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.126286 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:15.126448 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.126407 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:15.127607 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.127584 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:37:15.144904 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.144862 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podStartSLOduration=5.9098887300000005 podStartE2EDuration="12.144850344s" podCreationTimestamp="2026-04-17 17:37:03 +0000 UTC" firstStartedPulling="2026-04-17 17:37:08.101538721 +0000 UTC m=+739.771618344" lastFinishedPulling="2026-04-17 17:37:14.336500339 +0000 UTC m=+746.006579958" observedRunningTime="2026-04-17 17:37:15.143898894 +0000 UTC m=+746.813978536" watchObservedRunningTime="2026-04-17 17:37:15.144850344 +0000 UTC m=+746.814929985" Apr 17 17:37:15.852974 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.852929 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 17 17:37:15.853175 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:15.853058 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:37:16.130002 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:16.129915 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:37:20.852626 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:20.852584 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 
10.133.0.21:8643: connect: connection refused" Apr 17 17:37:20.856948 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:20.856923 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:37:20.858594 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:20.858541 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:37:21.135099 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:21.135023 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:37:21.135668 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:21.135639 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:37:25.852933 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:25.852893 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 17 17:37:30.852016 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:30.851979 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 17 17:37:30.857429 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:30.857399 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 17 17:37:30.857550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:30.857535 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:37:30.858171 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:30.858146 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:37:30.858266 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:30.858241 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:37:31.135888 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:31.135797 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" 
podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:37:34.113423 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.113401 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:37:34.189654 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.189616 2566 generic.go:358] "Generic (PLEG): container finished" podID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerID="55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78" exitCode=0 Apr 17 17:37:34.189902 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.189668 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerDied","Data":"55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78"} Apr 17 17:37:34.189902 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.189703 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" event={"ID":"ebfc1119-7d52-4bf8-b997-987ec7032fb5","Type":"ContainerDied","Data":"650488f0b812dcc53a7ee354e79775fb95d458c47d8a997c1c651f554c2baee6"} Apr 17 17:37:34.189902 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.189722 2566 scope.go:117] "RemoveContainer" containerID="55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78" Apr 17 17:37:34.189902 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.189723 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg" Apr 17 17:37:34.197755 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.197729 2566 scope.go:117] "RemoveContainer" containerID="e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8" Apr 17 17:37:34.206142 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.206119 2566 scope.go:117] "RemoveContainer" containerID="fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91" Apr 17 17:37:34.212912 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.212896 2566 scope.go:117] "RemoveContainer" containerID="250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a" Apr 17 17:37:34.219557 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.219539 2566 scope.go:117] "RemoveContainer" containerID="55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78" Apr 17 17:37:34.219820 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:37:34.219801 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78\": container with ID starting with 55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78 not found: ID does not exist" containerID="55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78" Apr 17 17:37:34.219876 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.219830 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78"} err="failed to get container status \"55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78\": rpc error: code = NotFound desc = could not find container \"55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78\": 
container with ID starting with 55d62db5ff29c88cdbce72e22a544fd75ca1a3f5086ae6c4dff4fa54f06adf78 not found: ID does not exist" Apr 17 17:37:34.219876 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.219849 2566 scope.go:117] "RemoveContainer" containerID="e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8" Apr 17 17:37:34.220035 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:37:34.220021 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8\": container with ID starting with e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8 not found: ID does not exist" containerID="e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8" Apr 17 17:37:34.220074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.220037 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8"} err="failed to get container status \"e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8\": rpc error: code = NotFound desc = could not find container \"e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8\": container with ID starting with e5a2e822de06e32bd2aa5566106ea64ed8989bb978e520ec791a76ab112a84e8 not found: ID does not exist" Apr 17 17:37:34.220074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.220048 2566 scope.go:117] "RemoveContainer" containerID="fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91" Apr 17 17:37:34.220301 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:37:34.220284 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91\": container with ID starting with fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91 not found: ID does not exist" containerID="fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91" Apr 17 17:37:34.220344 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.220306 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91"} err="failed to get container status \"fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91\": rpc error: code = NotFound desc = could not find container \"fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91\": container with ID starting with fc446bc46aeeb98bcaaf50f6e6350e76fb7e94d78c8d14fd6e1ebbcaf5ee1a91 not found: ID does not exist" Apr 17 17:37:34.220344 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.220323 2566 scope.go:117] "RemoveContainer" containerID="250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a" Apr 17 17:37:34.220491 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:37:34.220475 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a\": container with ID starting with 250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a not found: ID does not exist" containerID="250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a" Apr 17 17:37:34.220533 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.220494 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a"} err="failed to get container status \"250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a\": rpc error: code = NotFound desc = could not find container \"250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a\": container with ID starting with 250da3a69e8212967afc7d70ff00558c7fd203ac9dc0a37979647ddf32fd6c3a not found: ID does not exist" Apr 17 17:37:34.268506 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.268419 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdw8t\" (UniqueName: \"kubernetes.io/projected/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kube-api-access-fdw8t\") pod \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " Apr 17 17:37:34.268506 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.268489 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kserve-provision-location\") pod \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " Apr 17 17:37:34.268692 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.268546 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfc1119-7d52-4bf8-b997-987ec7032fb5-proxy-tls\") pod \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " Apr 17 17:37:34.268692 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.268570 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebfc1119-7d52-4bf8-b997-987ec7032fb5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\" (UID: \"ebfc1119-7d52-4bf8-b997-987ec7032fb5\") " Apr 17 17:37:34.268875 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.268846 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ebfc1119-7d52-4bf8-b997-987ec7032fb5" (UID: "ebfc1119-7d52-4bf8-b997-987ec7032fb5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:37:34.268934 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.268914 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfc1119-7d52-4bf8-b997-987ec7032fb5-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "ebfc1119-7d52-4bf8-b997-987ec7032fb5" (UID: "ebfc1119-7d52-4bf8-b997-987ec7032fb5"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:37:34.270603 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.270578 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kube-api-access-fdw8t" (OuterVolumeSpecName: "kube-api-access-fdw8t") pod "ebfc1119-7d52-4bf8-b997-987ec7032fb5" (UID: "ebfc1119-7d52-4bf8-b997-987ec7032fb5"). InnerVolumeSpecName "kube-api-access-fdw8t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:37:34.270692 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.270623 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebfc1119-7d52-4bf8-b997-987ec7032fb5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ebfc1119-7d52-4bf8-b997-987ec7032fb5" (UID: "ebfc1119-7d52-4bf8-b997-987ec7032fb5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:37:34.369314 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.369284 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:34.369314 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.369310 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfc1119-7d52-4bf8-b997-987ec7032fb5-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:34.369314 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.369321 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebfc1119-7d52-4bf8-b997-987ec7032fb5-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:34.369531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.369332 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdw8t\" (UniqueName: \"kubernetes.io/projected/ebfc1119-7d52-4bf8-b997-987ec7032fb5-kube-api-access-fdw8t\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:37:34.513682 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.513653 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg"] Apr 17 17:37:34.519062 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.519009 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-wlplg"] Apr 17 17:37:34.838357 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:34.838326 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" path="/var/lib/kubelet/pods/ebfc1119-7d52-4bf8-b997-987ec7032fb5/volumes" Apr 17 17:37:41.135920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:41.135882 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:37:51.136263 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:37:51.136214 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:38:01.136126 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:01.136076 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.22:8080: connect: connection refused" Apr 17 17:38:11.136044 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:11.136005 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:38:21.136438 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:21.136390 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 17 17:38:31.136109 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:31.136080 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:38:33.618383 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.618344 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw"] Apr 17 17:38:33.618799 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.618750 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" containerID="cri-o://ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5" gracePeriod=30 Apr 17 17:38:33.618865 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.618790 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kube-rbac-proxy" containerID="cri-o://f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716" gracePeriod=30 Apr 17 17:38:33.750183 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750156 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj"] Apr 17 17:38:33.750495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750481 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kube-rbac-proxy" Apr 17 17:38:33.750495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750496 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kube-rbac-proxy" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750506 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750513 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750522 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kserve-container" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750527 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" 
containerName="kserve-container" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750533 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="storage-initializer" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750539 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="storage-initializer" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750549 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750556 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750564 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" Apr 17 17:38:33.750577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750569 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" Apr 17 17:38:33.750910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750615 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kube-rbac-proxy" Apr 17 17:38:33.750910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750624 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kserve-container" Apr 17 17:38:33.750910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750630 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b7b11d5-7be8-4e9d-8c74-0172c6f19543" containerName="kserve-container" Apr 17 17:38:33.750910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750637 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="kube-rbac-proxy" Apr 17 17:38:33.750910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.750643 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebfc1119-7d52-4bf8-b997-987ec7032fb5" containerName="agent" Apr 17 17:38:33.753609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.753592 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.756279 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.756245 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 17 17:38:33.756736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.756719 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 17 17:38:33.764643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.764618 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj"] Apr 17 17:38:33.867532 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.867481 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/848a30d4-1b62-4b43-92f2-29a9dfe26a15-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.867717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.867609 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.867717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.867676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/848a30d4-1b62-4b43-92f2-29a9dfe26a15-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.867717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.867713 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzf8m\" (UniqueName: \"kubernetes.io/projected/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kube-api-access-fzf8m\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.969021 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.968940 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.969021 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.969001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/848a30d4-1b62-4b43-92f2-29a9dfe26a15-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.969365 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.969055 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzf8m\" (UniqueName: \"kubernetes.io/projected/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kube-api-access-fzf8m\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.969365 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.969097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/848a30d4-1b62-4b43-92f2-29a9dfe26a15-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.969469 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.969445 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.969809 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.969787 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/848a30d4-1b62-4b43-92f2-29a9dfe26a15-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.971745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.971709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/848a30d4-1b62-4b43-92f2-29a9dfe26a15-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:33.977695 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:33.977670 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzf8m\" (UniqueName: \"kubernetes.io/projected/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kube-api-access-fzf8m\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:34.063105 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:34.063064 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:34.186088 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:34.186053 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj"] Apr 17 17:38:34.189517 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:38:34.189488 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848a30d4_1b62_4b43_92f2_29a9dfe26a15.slice/crio-41d11b3ab9ad68679e57c576b9fefccd99074f88302370380a70a76494cacdbc WatchSource:0}: Error finding container 41d11b3ab9ad68679e57c576b9fefccd99074f88302370380a70a76494cacdbc: Status 404 returned error can't find the container with id 41d11b3ab9ad68679e57c576b9fefccd99074f88302370380a70a76494cacdbc Apr 17 17:38:34.362767 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:34.362737 2566 generic.go:358] "Generic (PLEG): container finished" podID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerID="f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716" exitCode=2 Apr 17 17:38:34.362929 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:34.362816 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerDied","Data":"f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716"} Apr 17 17:38:34.364107 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:34.364081 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerStarted","Data":"b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd"} Apr 17 17:38:34.364291 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:34.364114 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerStarted","Data":"41d11b3ab9ad68679e57c576b9fefccd99074f88302370380a70a76494cacdbc"} Apr 17 17:38:36.130658 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:36.130616 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 17 17:38:38.254391 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.254367 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:38:38.301907 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.301875 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd72de05-6f18-45e5-861d-a35b6692e91a-kserve-provision-location\") pod \"bd72de05-6f18-45e5-861d-a35b6692e91a\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " Apr 17 17:38:38.302074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.301920 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd72de05-6f18-45e5-861d-a35b6692e91a-proxy-tls\") pod \"bd72de05-6f18-45e5-861d-a35b6692e91a\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " Apr 17 17:38:38.302074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.301942 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6pj5\" (UniqueName: \"kubernetes.io/projected/bd72de05-6f18-45e5-861d-a35b6692e91a-kube-api-access-p6pj5\") pod \"bd72de05-6f18-45e5-861d-a35b6692e91a\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " Apr 17 17:38:38.302074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.301963 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd72de05-6f18-45e5-861d-a35b6692e91a-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"bd72de05-6f18-45e5-861d-a35b6692e91a\" (UID: \"bd72de05-6f18-45e5-861d-a35b6692e91a\") " Apr 17 17:38:38.302312 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.302284 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd72de05-6f18-45e5-861d-a35b6692e91a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd72de05-6f18-45e5-861d-a35b6692e91a" (UID: "bd72de05-6f18-45e5-861d-a35b6692e91a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:38.302389 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.302333 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd72de05-6f18-45e5-861d-a35b6692e91a-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "bd72de05-6f18-45e5-861d-a35b6692e91a" (UID: "bd72de05-6f18-45e5-861d-a35b6692e91a"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:38:38.303932 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.303915 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd72de05-6f18-45e5-861d-a35b6692e91a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bd72de05-6f18-45e5-861d-a35b6692e91a" (UID: "bd72de05-6f18-45e5-861d-a35b6692e91a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:38.304018 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.303912 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd72de05-6f18-45e5-861d-a35b6692e91a-kube-api-access-p6pj5" (OuterVolumeSpecName: "kube-api-access-p6pj5") pod "bd72de05-6f18-45e5-861d-a35b6692e91a" (UID: "bd72de05-6f18-45e5-861d-a35b6692e91a"). InnerVolumeSpecName "kube-api-access-p6pj5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:38.377465 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.377426 2566 generic.go:358] "Generic (PLEG): container finished" podID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerID="ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5" exitCode=0 Apr 17 17:38:38.377610 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.377502 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" Apr 17 17:38:38.377610 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.377502 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerDied","Data":"ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5"} Apr 17 17:38:38.377610 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.377601 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw" event={"ID":"bd72de05-6f18-45e5-861d-a35b6692e91a","Type":"ContainerDied","Data":"019133a6d27b038e462d2fdfcf71c293fd8c936a77c3bde8bded2c96a2842c0f"} Apr 17 17:38:38.377796 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.377621 2566 scope.go:117] "RemoveContainer" containerID="f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716" Apr 17 17:38:38.378901 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.378878 2566 generic.go:358] "Generic (PLEG): container finished" podID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerID="b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd" exitCode=0 Apr 17 17:38:38.378990 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.378916 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerDied","Data":"b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd"} Apr 17 17:38:38.386030 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.386014 2566 scope.go:117] "RemoveContainer" containerID="ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5" Apr 17 17:38:38.392827 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.392807 2566 scope.go:117] "RemoveContainer" containerID="1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9" Apr 17 17:38:38.400107 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.400089 2566 scope.go:117] "RemoveContainer" containerID="f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716" Apr 17 17:38:38.400391 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:38:38.400364 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716\": container with ID starting with f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716 not found: ID does not exist" containerID="f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716" Apr 17 17:38:38.400469 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.400403 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716"} err="failed to get container status \"f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716\": rpc error: code = NotFound 
desc = could not find container \"f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716\": container with ID starting with f79d7130e95d4de341acc9b004ad9be3faa16e5a6d90f40ea4766d8fd85ef716 not found: ID does not exist" Apr 17 17:38:38.400469 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.400427 2566 scope.go:117] "RemoveContainer" containerID="ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5" Apr 17 17:38:38.400698 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:38:38.400677 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5\": container with ID starting with ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5 not found: ID does not exist" containerID="ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5" Apr 17 17:38:38.400738 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.400703 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5"} err="failed to get container status \"ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5\": rpc error: code = NotFound desc = could not find container \"ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5\": container with ID starting with ddf28f2e58d8c8f520e777999c904d730bf096687a180741eeb4b5a9e5f30dd5 not found: ID does not exist" Apr 17 17:38:38.400738 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.400717 2566 scope.go:117] "RemoveContainer" containerID="1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9" Apr 17 17:38:38.400937 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:38:38.400920 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9\": container with ID starting with 1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9 not found: ID does not exist" containerID="1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9" Apr 17 17:38:38.400982 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.400940 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9"} err="failed to get container status \"1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9\": rpc error: code = NotFound desc = could not find container \"1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9\": container with ID starting with 1eedd9e73d15aef13a1264a2af1ffe986ea7a10230c3d113902beebc5499eda9 not found: ID does not exist" Apr 17 17:38:38.402763 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.402743 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd72de05-6f18-45e5-861d-a35b6692e91a-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:38:38.402868 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.402765 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd72de05-6f18-45e5-861d-a35b6692e91a-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:38:38.402868 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.402781 2566 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6pj5\" (UniqueName: \"kubernetes.io/projected/bd72de05-6f18-45e5-861d-a35b6692e91a-kube-api-access-p6pj5\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:38:38.402868 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.402794 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd72de05-6f18-45e5-861d-a35b6692e91a-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:38:38.433053 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.433030 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw"] Apr 17 17:38:38.440445 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.440426 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9vmjw"] Apr 17 17:38:38.839223 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:38.839182 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" path="/var/lib/kubelet/pods/bd72de05-6f18-45e5-861d-a35b6692e91a/volumes" Apr 17 17:38:39.383689 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:39.383654 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerStarted","Data":"214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d"} Apr 17 17:38:39.383689 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:39.383690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerStarted","Data":"f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e"} Apr 17 17:38:39.384189 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:39.383937 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:39.403234 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:39.403193 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podStartSLOduration=6.4031789230000005 podStartE2EDuration="6.403178923s" podCreationTimestamp="2026-04-17 17:38:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:39.402373489 +0000 UTC m=+831.072453130" watchObservedRunningTime="2026-04-17 17:38:39.403178923 +0000 UTC m=+831.073258566" Apr 17 17:38:40.387107 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:40.387076 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:40.388341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:40.388302 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:38:41.389710 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:41.389675 
2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:38:46.394020 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:46.393991 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:38:46.394517 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:46.394461 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:38:56.395240 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:38:56.395194 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:39:06.394831 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:06.394786 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:39:16.394968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:16.394929 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:39:26.395049 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:26.394966 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:39:36.394495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:36.394456 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:39:46.394926 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:46.394882 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:39:48.814027 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:48.814002 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:39:48.814475 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:48.814458 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:39:56.395479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:39:56.395444 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:40:04.310193 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.310148 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj"] Apr 17 17:40:04.310995 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.310618 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" containerID="cri-o://f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e" gracePeriod=30 Apr 17 17:40:04.310995 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.310685 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kube-rbac-proxy" containerID="cri-o://214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d" gracePeriod=30 Apr 17 17:40:04.447211 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447180 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4"] Apr 17 17:40:04.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447511 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="storage-initializer" Apr 17 17:40:04.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447528 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="storage-initializer" Apr 17 17:40:04.447530 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447537 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" Apr 17 17:40:04.447736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447543 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" Apr 17 17:40:04.447736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447562 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kube-rbac-proxy" Apr 17 17:40:04.447736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447577 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kube-rbac-proxy" Apr 17 17:40:04.447736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447619 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kserve-container" Apr 17 17:40:04.447736 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.447630 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd72de05-6f18-45e5-861d-a35b6692e91a" containerName="kube-rbac-proxy" Apr 17 17:40:04.450581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.450566 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.452815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.452790 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 17 17:40:04.453011 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.452997 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 17 17:40:04.460514 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.460489 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4"] Apr 17 17:40:04.512383 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.512351 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzq7k\" (UniqueName: \"kubernetes.io/projected/24df7c9b-6597-4392-9212-3809044f0293-kube-api-access-wzq7k\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.512552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.512401 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24df7c9b-6597-4392-9212-3809044f0293-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.512552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.512462 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24df7c9b-6597-4392-9212-3809044f0293-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.512552 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.512513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24df7c9b-6597-4392-9212-3809044f0293-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.613489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.613416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzq7k\" (UniqueName: \"kubernetes.io/projected/24df7c9b-6597-4392-9212-3809044f0293-kube-api-access-wzq7k\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.613489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.613457 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24df7c9b-6597-4392-9212-3809044f0293-proxy-tls\") pod 
\"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.613489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.613486 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24df7c9b-6597-4392-9212-3809044f0293-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.613767 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.613506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24df7c9b-6597-4392-9212-3809044f0293-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.613922 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.613902 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24df7c9b-6597-4392-9212-3809044f0293-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.614112 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.614093 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24df7c9b-6597-4392-9212-3809044f0293-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.615955 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.615930 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24df7c9b-6597-4392-9212-3809044f0293-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.621874 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.621855 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzq7k\" (UniqueName: \"kubernetes.io/projected/24df7c9b-6597-4392-9212-3809044f0293-kube-api-access-wzq7k\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.637952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.637928 2566 generic.go:358] "Generic (PLEG): container finished" podID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerID="214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d" exitCode=2 Apr 17 17:40:04.638058 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.637967 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerDied","Data":"214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d"} Apr 17 17:40:04.761402 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.761367 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:40:04.883273 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:04.883161 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4"] Apr 17 17:40:04.885843 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:40:04.885814 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24df7c9b_6597_4392_9212_3809044f0293.slice/crio-2b7ab9acb215718ddc86724b7d9ebdc82ec2e71d611df3b41df30bda6de81523 WatchSource:0}: Error finding container 2b7ab9acb215718ddc86724b7d9ebdc82ec2e71d611df3b41df30bda6de81523: Status 404 returned error can't find the container with id 2b7ab9acb215718ddc86724b7d9ebdc82ec2e71d611df3b41df30bda6de81523 Apr 17 17:40:05.642170 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:05.642132 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerStarted","Data":"354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61"} Apr 17 17:40:05.642170 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:05.642175 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerStarted","Data":"2b7ab9acb215718ddc86724b7d9ebdc82ec2e71d611df3b41df30bda6de81523"} Apr 17 17:40:06.390452 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:06.390407 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 17 17:40:06.394747 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:06.394717 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 17:40:08.653296 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:08.653242 2566 generic.go:358] "Generic (PLEG): container finished" podID="24df7c9b-6597-4392-9212-3809044f0293" containerID="354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61" exitCode=0 Apr 17 17:40:08.653649 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:08.653316 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerDied","Data":"354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61"} Apr 17 17:40:09.175970 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.175657 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:40:09.255178 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.255094 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kserve-provision-location\") pod \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " Apr 17 17:40:09.255178 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.255149 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/848a30d4-1b62-4b43-92f2-29a9dfe26a15-proxy-tls\") pod \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " Apr 17 17:40:09.255417 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.255205 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/848a30d4-1b62-4b43-92f2-29a9dfe26a15-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " Apr 17 17:40:09.255417 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.255266 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzf8m\" (UniqueName: \"kubernetes.io/projected/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kube-api-access-fzf8m\") pod \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\" (UID: \"848a30d4-1b62-4b43-92f2-29a9dfe26a15\") " Apr 17 17:40:09.256185 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.255896 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848a30d4-1b62-4b43-92f2-29a9dfe26a15-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "848a30d4-1b62-4b43-92f2-29a9dfe26a15" (UID: "848a30d4-1b62-4b43-92f2-29a9dfe26a15"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:40:09.256185 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.256018 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "848a30d4-1b62-4b43-92f2-29a9dfe26a15" (UID: "848a30d4-1b62-4b43-92f2-29a9dfe26a15"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:40:09.266231 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.262397 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kube-api-access-fzf8m" (OuterVolumeSpecName: "kube-api-access-fzf8m") pod "848a30d4-1b62-4b43-92f2-29a9dfe26a15" (UID: "848a30d4-1b62-4b43-92f2-29a9dfe26a15"). InnerVolumeSpecName "kube-api-access-fzf8m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:40:09.266231 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.262527 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848a30d4-1b62-4b43-92f2-29a9dfe26a15-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "848a30d4-1b62-4b43-92f2-29a9dfe26a15" (UID: "848a30d4-1b62-4b43-92f2-29a9dfe26a15"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:40:09.356186 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.356093 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/848a30d4-1b62-4b43-92f2-29a9dfe26a15-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:40:09.356186 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.356130 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzf8m\" (UniqueName: \"kubernetes.io/projected/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kube-api-access-fzf8m\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:40:09.356186 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.356147 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/848a30d4-1b62-4b43-92f2-29a9dfe26a15-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:40:09.356186 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.356162 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/848a30d4-1b62-4b43-92f2-29a9dfe26a15-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:40:09.664178 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.664138 2566 generic.go:358] "Generic (PLEG): container finished" podID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerID="f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e" exitCode=0 Apr 17 17:40:09.664650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.664202 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerDied","Data":"f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e"} Apr 17 17:40:09.664650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.664234 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" event={"ID":"848a30d4-1b62-4b43-92f2-29a9dfe26a15","Type":"ContainerDied","Data":"41d11b3ab9ad68679e57c576b9fefccd99074f88302370380a70a76494cacdbc"} Apr 17 17:40:09.664650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.664276 2566 scope.go:117] "RemoveContainer" containerID="214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d" Apr 17 17:40:09.664650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.664449 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj" Apr 17 17:40:09.693922 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.693698 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj"] Apr 17 17:40:09.693922 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.693731 2566 scope.go:117] "RemoveContainer" containerID="f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e" Apr 17 17:40:09.699547 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.699502 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-kfwmj"] Apr 17 17:40:09.708180 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.708063 2566 scope.go:117] "RemoveContainer" containerID="b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd" Apr 17 17:40:09.724704 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.724600 2566 scope.go:117] "RemoveContainer" containerID="214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d" Apr 17 17:40:09.725112 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:40:09.724988 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d\": container with ID starting with 214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d not found: ID does not exist" containerID="214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d" Apr 17 17:40:09.725112 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.725025 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d"} err="failed to get container status \"214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d\": rpc error: code = NotFound desc = could not find container \"214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d\": container with ID starting with 214cd5b71653c87769f4cfdd8d1d23b3f7093ac10b45d8b572a138928b1d996d not found: ID does not exist" Apr 17 17:40:09.725112 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.725050 2566 scope.go:117] "RemoveContainer" containerID="f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e" Apr 17 17:40:09.725915 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:40:09.725789 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e\": container with ID starting with f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e not found: ID does not exist" containerID="f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e" Apr 17 17:40:09.725915 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.725819 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e"} err="failed to get container status \"f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e\": rpc error: code = NotFound desc = could not find container \"f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e\": container with ID starting with f79c004704c716e4279a3676d47d73717ac95a48fe954dcb7d83179136db528e not found: ID does not exist" Apr 17 17:40:09.725915 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:40:09.725840 2566 scope.go:117] "RemoveContainer" containerID="b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd" Apr 17 17:40:09.726448 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:40:09.726357 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd\": container with ID starting with b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd not found: ID does not exist" containerID="b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd" Apr 17 17:40:09.726448 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:09.726406 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd"} err="failed to get container status \"b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd\": rpc error: code = NotFound desc = could not find container \"b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd\": container with ID starting with b172902dc48d9eef568f2a788bea939a872ce0c0ed6326b6d8f6ae95eef50cbd not found: ID does not exist" Apr 17 17:40:10.841033 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:40:10.840964 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" path="/var/lib/kubelet/pods/848a30d4-1b62-4b43-92f2-29a9dfe26a15/volumes" Apr 17 17:42:40.210213 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:40.210188 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:42:41.154821 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:41.154783 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerStarted","Data":"5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5"} Apr 17 17:42:41.154821 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:41.154823 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerStarted","Data":"ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37"} Apr 17 17:42:41.155035 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:41.154902 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:42:41.185425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:41.185373 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" podStartSLOduration=5.771159346 podStartE2EDuration="2m37.185357593s" podCreationTimestamp="2026-04-17 17:40:04 +0000 UTC" firstStartedPulling="2026-04-17 17:40:08.654368179 +0000 UTC m=+920.324447799" lastFinishedPulling="2026-04-17 17:42:40.068566422 +0000 UTC m=+1071.738646046" observedRunningTime="2026-04-17 17:42:41.183331742 +0000 UTC m=+1072.853411385" watchObservedRunningTime="2026-04-17 17:42:41.185357593 +0000 UTC m=+1072.855437236" Apr 17 17:42:42.157898 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:42.157860 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:42:48.168161 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:42:48.168131 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:43:18.171424 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:18.171397 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:43:24.638090 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.638059 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4"] Apr 17 17:43:24.638550 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.638516 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kserve-container" containerID="cri-o://ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37" gracePeriod=30 Apr 17 17:43:24.638621 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.638588 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kube-rbac-proxy" containerID="cri-o://5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5" gracePeriod=30 Apr 17 17:43:24.765320 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765289 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn"] Apr 17 17:43:24.765685 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765668 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="storage-initializer" Apr 17 17:43:24.765769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765687 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="storage-initializer" Apr 17 17:43:24.765769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765714 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" Apr 17 17:43:24.765769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765723 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" Apr 17 17:43:24.765769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765739 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kube-rbac-proxy" Apr 17 17:43:24.765769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765748 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kube-rbac-proxy" Apr 17 17:43:24.766020 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765827 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kube-rbac-proxy" Apr 17 17:43:24.766020 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.765843 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="848a30d4-1b62-4b43-92f2-29a9dfe26a15" containerName="kserve-container" Apr 17 17:43:24.768984 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.768964 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.772888 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.772864 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 17 17:43:24.773032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.773014 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 17 17:43:24.782873 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.782840 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn"] Apr 17 17:43:24.867663 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.867627 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tlzw\" (UniqueName: \"kubernetes.io/projected/380f336d-3146-4288-97cf-67cb7422fccf-kube-api-access-8tlzw\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.867663 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.867668 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/380f336d-3146-4288-97cf-67cb7422fccf-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.867933 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.867694 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380f336d-3146-4288-97cf-67cb7422fccf-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.867933 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.867815 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.969274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.969185 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tlzw\" (UniqueName: \"kubernetes.io/projected/380f336d-3146-4288-97cf-67cb7422fccf-kube-api-access-8tlzw\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.969274 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:43:24.969221 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/380f336d-3146-4288-97cf-67cb7422fccf-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.969449 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.969247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380f336d-3146-4288-97cf-67cb7422fccf-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.969508 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.969480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.969647 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:24.969631 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 17 17:43:24.969713 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:24.969701 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls podName:380f336d-3146-4288-97cf-67cb7422fccf nodeName:}" failed. No retries permitted until 2026-04-17 17:43:25.469683745 +0000 UTC m=+1117.139763366 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" (UID: "380f336d-3146-4288-97cf-67cb7422fccf") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 17 17:43:24.969818 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.969798 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380f336d-3146-4288-97cf-67cb7422fccf-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.969994 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.969974 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/380f336d-3146-4288-97cf-67cb7422fccf-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:24.979873 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:24.979845 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tlzw\" (UniqueName: \"kubernetes.io/projected/380f336d-3146-4288-97cf-67cb7422fccf-kube-api-access-8tlzw\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:25.285817 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.285732 2566 generic.go:358] "Generic (PLEG): container finished" podID="24df7c9b-6597-4392-9212-3809044f0293" containerID="5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5" exitCode=2 Apr 17 17:43:25.285817 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.285779 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerDied","Data":"5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5"} Apr 17 17:43:25.474957 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.474922 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:25.477435 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.477410 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:25.676796 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.676771 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:43:25.686537 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.686509 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:25.777704 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.777675 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzq7k\" (UniqueName: \"kubernetes.io/projected/24df7c9b-6597-4392-9212-3809044f0293-kube-api-access-wzq7k\") pod \"24df7c9b-6597-4392-9212-3809044f0293\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " Apr 17 17:43:25.777857 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.777773 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24df7c9b-6597-4392-9212-3809044f0293-proxy-tls\") pod \"24df7c9b-6597-4392-9212-3809044f0293\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " Apr 17 17:43:25.777912 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.777889 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24df7c9b-6597-4392-9212-3809044f0293-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"24df7c9b-6597-4392-9212-3809044f0293\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " Apr 17 17:43:25.777968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.777934 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24df7c9b-6597-4392-9212-3809044f0293-kserve-provision-location\") pod \"24df7c9b-6597-4392-9212-3809044f0293\" (UID: \"24df7c9b-6597-4392-9212-3809044f0293\") " Apr 17 17:43:25.778303 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.778272 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24df7c9b-6597-4392-9212-3809044f0293-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "24df7c9b-6597-4392-9212-3809044f0293" (UID: "24df7c9b-6597-4392-9212-3809044f0293"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:43:25.778433 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.778347 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24df7c9b-6597-4392-9212-3809044f0293-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "24df7c9b-6597-4392-9212-3809044f0293" (UID: "24df7c9b-6597-4392-9212-3809044f0293"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:43:25.780178 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.780152 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24df7c9b-6597-4392-9212-3809044f0293-kube-api-access-wzq7k" (OuterVolumeSpecName: "kube-api-access-wzq7k") pod "24df7c9b-6597-4392-9212-3809044f0293" (UID: "24df7c9b-6597-4392-9212-3809044f0293"). InnerVolumeSpecName "kube-api-access-wzq7k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:43:25.780304 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.780275 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24df7c9b-6597-4392-9212-3809044f0293-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "24df7c9b-6597-4392-9212-3809044f0293" (UID: "24df7c9b-6597-4392-9212-3809044f0293"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:43:25.810357 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.810334 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn"] Apr 17 17:43:25.813066 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:43:25.813043 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380f336d_3146_4288_97cf_67cb7422fccf.slice/crio-3a7fda4692d5435e6b94c742d500778b3ddd66f11bb5109141fae24619aab837 WatchSource:0}: Error finding container 3a7fda4692d5435e6b94c742d500778b3ddd66f11bb5109141fae24619aab837: Status 404 returned error can't find the container with id 3a7fda4692d5435e6b94c742d500778b3ddd66f11bb5109141fae24619aab837 Apr 17 17:43:25.879774 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.879741 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24df7c9b-6597-4392-9212-3809044f0293-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:25.879865 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.879779 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24df7c9b-6597-4392-9212-3809044f0293-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:25.879865 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.879800 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24df7c9b-6597-4392-9212-3809044f0293-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:25.879865 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:25.879819 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzq7k\" (UniqueName: \"kubernetes.io/projected/24df7c9b-6597-4392-9212-3809044f0293-kube-api-access-wzq7k\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:26.291437 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.291346 2566 generic.go:358] "Generic (PLEG): container finished" podID="24df7c9b-6597-4392-9212-3809044f0293" containerID="ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37" exitCode=0 Apr 17 17:43:26.291437 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.291425 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" Apr 17 17:43:26.291678 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.291431 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerDied","Data":"ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37"} Apr 17 17:43:26.291678 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.291466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4" event={"ID":"24df7c9b-6597-4392-9212-3809044f0293","Type":"ContainerDied","Data":"2b7ab9acb215718ddc86724b7d9ebdc82ec2e71d611df3b41df30bda6de81523"} Apr 17 17:43:26.291678 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.291484 2566 scope.go:117] "RemoveContainer" containerID="5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5" Apr 17 17:43:26.292973 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.292949 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerStarted","Data":"d8222c6c212dbcb9141784b2a5daa59dfa9e081d515d56dbb7469475f4be8593"} Apr 17 17:43:26.293107 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.292980 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerStarted","Data":"3a7fda4692d5435e6b94c742d500778b3ddd66f11bb5109141fae24619aab837"} Apr 17 17:43:26.299557 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.299370 2566 scope.go:117] "RemoveContainer" containerID="ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37" Apr 17 17:43:26.306851 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.306779 2566 scope.go:117] "RemoveContainer" containerID="354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61" Apr 17 17:43:26.318004 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.317982 2566 scope.go:117] "RemoveContainer" containerID="5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5" Apr 17 17:43:26.318292 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:26.318266 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5\": container with ID starting with 5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5 not found: ID does not exist" containerID="5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5" Apr 17 17:43:26.318377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.318301 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5"} err="failed to get container status \"5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5\": rpc error: code = NotFound desc = could not find container \"5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5\": container with ID starting with 5261f1e631ac9cb35c9b3b9afb51801437bc8a1030c89c95b36b81f02ba7a9c5 not found: ID does not exist" Apr 17 17:43:26.318377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.318321 2566 scope.go:117] 
"RemoveContainer" containerID="ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37" Apr 17 17:43:26.318587 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:26.318552 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37\": container with ID starting with ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37 not found: ID does not exist" containerID="ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37" Apr 17 17:43:26.318655 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.318587 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37"} err="failed to get container status \"ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37\": rpc error: code = NotFound desc = could not find container \"ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37\": container with ID starting with ffc6a7ee7c9392120007f9f139e880c83148fff31a253873f954b2aa10fa7a37 not found: ID does not exist" Apr 17 17:43:26.318655 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.318611 2566 scope.go:117] "RemoveContainer" containerID="354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61" Apr 17 17:43:26.318909 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:26.318886 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61\": container with ID starting with 354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61 not found: ID does not exist" containerID="354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61" Apr 17 17:43:26.318970 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.318915 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61"} err="failed to get container status \"354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61\": rpc error: code = NotFound desc = could not find container \"354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61\": container with ID starting with 354da5f45c4f22f5a0c165981004c78f09e044b37424846d7f837ba405018b61 not found: ID does not exist" Apr 17 17:43:26.330607 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.330586 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4"] Apr 17 17:43:26.334390 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.334372 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-jmtt4"] Apr 17 17:43:26.838380 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:26.838349 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24df7c9b-6597-4392-9212-3809044f0293" path="/var/lib/kubelet/pods/24df7c9b-6597-4392-9212-3809044f0293/volumes" Apr 17 17:43:30.305897 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:30.305819 2566 generic.go:358] "Generic (PLEG): container finished" podID="380f336d-3146-4288-97cf-67cb7422fccf" containerID="d8222c6c212dbcb9141784b2a5daa59dfa9e081d515d56dbb7469475f4be8593" exitCode=0 Apr 17 17:43:30.306229 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:30.305893 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerDied","Data":"d8222c6c212dbcb9141784b2a5daa59dfa9e081d515d56dbb7469475f4be8593"} Apr 17 17:43:31.311139 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:31.311103 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerStarted","Data":"29fb197187784b1842d1df025e33e91be0a49596345c6426e3be3794a3e77ef9"} Apr 17 17:43:31.311139 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:31.311140 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerStarted","Data":"7c4665089e1e2acad7b62b8234e46c5b34582c09ee895af725e5c6a56106306f"} Apr 17 17:43:31.311589 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:31.311453 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:31.311640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:31.311601 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:31.312573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:31.312547 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 17 17:43:31.330664 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:31.330615 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" podStartSLOduration=7.330599673 podStartE2EDuration="7.330599673s" podCreationTimestamp="2026-04-17 17:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:43:31.329996094 +0000 UTC m=+1123.000075736" watchObservedRunningTime="2026-04-17 17:43:31.330599673 +0000 UTC m=+1123.000679318" Apr 17 17:43:32.314468 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:32.314428 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 17 17:43:37.319029 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:37.319001 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:37.319632 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:37.319604 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 17 17:43:47.320243 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:47.320211 2566 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:54.777520 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.777485 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn"] Apr 17 17:43:54.777916 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.777800 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" containerID="cri-o://7c4665089e1e2acad7b62b8234e46c5b34582c09ee895af725e5c6a56106306f" gracePeriod=30 Apr 17 17:43:54.777916 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.777847 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kube-rbac-proxy" containerID="cri-o://29fb197187784b1842d1df025e33e91be0a49596345c6426e3be3794a3e77ef9" gracePeriod=30 Apr 17 17:43:54.876952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.876921 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k"] Apr 17 17:43:54.877266 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877238 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kserve-container" Apr 17 17:43:54.877313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877268 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kserve-container" Apr 17 17:43:54.877313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877289 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kube-rbac-proxy" Apr 17 17:43:54.877313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877294 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kube-rbac-proxy" Apr 17 17:43:54.877313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877301 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="storage-initializer" Apr 17 17:43:54.877313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877306 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="storage-initializer" Apr 17 17:43:54.877462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877391 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kserve-container" Apr 17 17:43:54.877462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.877406 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="24df7c9b-6597-4392-9212-3809044f0293" containerName="kube-rbac-proxy" Apr 17 17:43:54.880417 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.880401 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:54.882708 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.882686 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 17 17:43:54.882818 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.882686 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 17 17:43:54.894945 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.894925 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k"] Apr 17 17:43:54.910968 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.910943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5905bdf9-065a-4899-973f-07fc299537b8-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:54.911073 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.910973 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwczd\" (UniqueName: \"kubernetes.io/projected/5905bdf9-065a-4899-973f-07fc299537b8-kube-api-access-hwczd\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:54.911073 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.910991 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5905bdf9-065a-4899-973f-07fc299537b8-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:54.911073 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:54.911022 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.012698 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.012660 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5905bdf9-065a-4899-973f-07fc299537b8-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.012883 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.012711 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwczd\" (UniqueName: 
\"kubernetes.io/projected/5905bdf9-065a-4899-973f-07fc299537b8-kube-api-access-hwczd\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.012883 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.012743 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5905bdf9-065a-4899-973f-07fc299537b8-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.012883 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.012804 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.013041 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:55.012924 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 17 17:43:55.013095 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:55.013035 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls podName:5905bdf9-065a-4899-973f-07fc299537b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:43:55.513003181 +0000 UTC m=+1147.183082813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" (UID: "5905bdf9-065a-4899-973f-07fc299537b8") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 17 17:43:55.013215 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.013197 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5905bdf9-065a-4899-973f-07fc299537b8-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.013440 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.013421 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5905bdf9-065a-4899-973f-07fc299537b8-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.023871 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.023837 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwczd\" (UniqueName: \"kubernetes.io/projected/5905bdf9-065a-4899-973f-07fc299537b8-kube-api-access-hwczd\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.386039 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.385999 2566 generic.go:358] "Generic (PLEG): container finished" podID="380f336d-3146-4288-97cf-67cb7422fccf" containerID="29fb197187784b1842d1df025e33e91be0a49596345c6426e3be3794a3e77ef9" exitCode=2 Apr 17 17:43:55.386039 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.386024 2566 generic.go:358] "Generic (PLEG): container finished" podID="380f336d-3146-4288-97cf-67cb7422fccf" containerID="7c4665089e1e2acad7b62b8234e46c5b34582c09ee895af725e5c6a56106306f" exitCode=0 Apr 17 17:43:55.386232 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.386056 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerDied","Data":"29fb197187784b1842d1df025e33e91be0a49596345c6426e3be3794a3e77ef9"} Apr 17 17:43:55.386232 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.386081 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerDied","Data":"7c4665089e1e2acad7b62b8234e46c5b34582c09ee895af725e5c6a56106306f"} Apr 17 17:43:55.417514 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.417492 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:55.517242 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517207 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls\") pod \"380f336d-3146-4288-97cf-67cb7422fccf\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " Apr 17 17:43:55.517449 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517273 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380f336d-3146-4288-97cf-67cb7422fccf-kserve-provision-location\") pod \"380f336d-3146-4288-97cf-67cb7422fccf\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " Apr 17 17:43:55.517449 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517307 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tlzw\" (UniqueName: \"kubernetes.io/projected/380f336d-3146-4288-97cf-67cb7422fccf-kube-api-access-8tlzw\") pod \"380f336d-3146-4288-97cf-67cb7422fccf\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " Apr 17 17:43:55.517449 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517340 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/380f336d-3146-4288-97cf-67cb7422fccf-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"380f336d-3146-4288-97cf-67cb7422fccf\" (UID: \"380f336d-3146-4288-97cf-67cb7422fccf\") " Apr 17 17:43:55.517449 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:55.517658 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:55.517607 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 17 17:43:55.517715 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517634 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380f336d-3146-4288-97cf-67cb7422fccf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "380f336d-3146-4288-97cf-67cb7422fccf" (UID: "380f336d-3146-4288-97cf-67cb7422fccf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:43:55.517715 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:43:55.517676 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls podName:5905bdf9-065a-4899-973f-07fc299537b8 nodeName:}" failed. No retries permitted until 2026-04-17 17:43:56.517660529 +0000 UTC m=+1148.187740150 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" (UID: "5905bdf9-065a-4899-973f-07fc299537b8") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 17 17:43:55.517795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.517736 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/380f336d-3146-4288-97cf-67cb7422fccf-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "380f336d-3146-4288-97cf-67cb7422fccf" (UID: "380f336d-3146-4288-97cf-67cb7422fccf"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:43:55.519355 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.519300 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/380f336d-3146-4288-97cf-67cb7422fccf-kube-api-access-8tlzw" (OuterVolumeSpecName: "kube-api-access-8tlzw") pod "380f336d-3146-4288-97cf-67cb7422fccf" (UID: "380f336d-3146-4288-97cf-67cb7422fccf"). InnerVolumeSpecName "kube-api-access-8tlzw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:43:55.519355 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.519321 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "380f336d-3146-4288-97cf-67cb7422fccf" (UID: "380f336d-3146-4288-97cf-67cb7422fccf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:43:55.618196 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.618139 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380f336d-3146-4288-97cf-67cb7422fccf-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:55.618196 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.618187 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8tlzw\" (UniqueName: \"kubernetes.io/projected/380f336d-3146-4288-97cf-67cb7422fccf-kube-api-access-8tlzw\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:55.618196 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.618202 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/380f336d-3146-4288-97cf-67cb7422fccf-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:55.618538 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:55.618217 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/380f336d-3146-4288-97cf-67cb7422fccf-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:43:56.390511 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.390433 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" event={"ID":"380f336d-3146-4288-97cf-67cb7422fccf","Type":"ContainerDied","Data":"3a7fda4692d5435e6b94c742d500778b3ddd66f11bb5109141fae24619aab837"} Apr 17 17:43:56.390511 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:43:56.390474 2566 scope.go:117] "RemoveContainer" containerID="29fb197187784b1842d1df025e33e91be0a49596345c6426e3be3794a3e77ef9" Apr 17 17:43:56.390511 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.390475 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn" Apr 17 17:43:56.398701 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.398684 2566 scope.go:117] "RemoveContainer" containerID="7c4665089e1e2acad7b62b8234e46c5b34582c09ee895af725e5c6a56106306f" Apr 17 17:43:56.405494 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.405478 2566 scope.go:117] "RemoveContainer" containerID="d8222c6c212dbcb9141784b2a5daa59dfa9e081d515d56dbb7469475f4be8593" Apr 17 17:43:56.413164 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.413144 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn"] Apr 17 17:43:56.419660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.419638 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-qkrgn"] Apr 17 17:43:56.525199 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.525162 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:56.527479 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.527456 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:56.690982 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.690893 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:43:56.810728 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.810705 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k"] Apr 17 17:43:56.813865 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:43:56.813829 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5905bdf9_065a_4899_973f_07fc299537b8.slice/crio-0cc2121fdc9aeb0ccb04a08ca0a96198514c0f5dc71a989a30e3aa7e1fdf25bd WatchSource:0}: Error finding container 0cc2121fdc9aeb0ccb04a08ca0a96198514c0f5dc71a989a30e3aa7e1fdf25bd: Status 404 returned error can't find the container with id 0cc2121fdc9aeb0ccb04a08ca0a96198514c0f5dc71a989a30e3aa7e1fdf25bd Apr 17 17:43:56.838444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:56.838409 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="380f336d-3146-4288-97cf-67cb7422fccf" path="/var/lib/kubelet/pods/380f336d-3146-4288-97cf-67cb7422fccf/volumes" Apr 17 17:43:57.394716 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:57.394679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerStarted","Data":"21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd"} Apr 17 17:43:57.394716 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:43:57.394718 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerStarted","Data":"0cc2121fdc9aeb0ccb04a08ca0a96198514c0f5dc71a989a30e3aa7e1fdf25bd"} Apr 17 17:44:01.408577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:01.408546 2566 generic.go:358] "Generic (PLEG): container finished" podID="5905bdf9-065a-4899-973f-07fc299537b8" containerID="21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd" exitCode=0 Apr 17 17:44:01.408951 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:01.408618 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerDied","Data":"21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd"} Apr 17 17:44:02.414385 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:02.414343 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerStarted","Data":"e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4"} Apr 17 17:44:02.414769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:02.414395 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerStarted","Data":"dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e"} Apr 17 17:44:02.414769 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:02.414627 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:44:02.436598 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:02.436543 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" podStartSLOduration=8.436526513 podStartE2EDuration="8.436526513s" podCreationTimestamp="2026-04-17 17:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:02.433982035 +0000 UTC m=+1154.104061676" watchObservedRunningTime="2026-04-17 17:44:02.436526513 +0000 UTC m=+1154.106606156" Apr 17 17:44:03.417614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:03.417588 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:44:09.426108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:09.426075 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:44:39.430092 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:39.430063 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:44:44.956458 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:44.956409 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k"] Apr 17 17:44:44.956903 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:44.956694 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kserve-container" containerID="cri-o://dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e" gracePeriod=30 Apr 17 17:44:44.956903 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:44.956728 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kube-rbac-proxy" containerID="cri-o://e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4" gracePeriod=30 Apr 17 17:44:45.064521 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064486 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k"] Apr 17 17:44:45.064908 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064887 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kube-rbac-proxy" Apr 17 17:44:45.064908 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064909 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kube-rbac-proxy" Apr 17 17:44:45.065052 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064932 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" Apr 17 17:44:45.065052 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064940 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" Apr 17 17:44:45.065052 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064955 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="storage-initializer" Apr 17 17:44:45.065052 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.064964 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="storage-initializer" Apr 17 17:44:45.065244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.065065 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kube-rbac-proxy" Apr 17 17:44:45.065244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.065081 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="380f336d-3146-4288-97cf-67cb7422fccf" containerName="kserve-container" Apr 17 17:44:45.068308 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.068287 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.070608 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.070585 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 17 17:44:45.070728 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.070664 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 17 17:44:45.080068 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.080040 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k"] Apr 17 17:44:45.134659 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.134628 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939f6ff3-0ca9-40c7-b791-cc203da8037b-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.134659 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.134664 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939f6ff3-0ca9-40c7-b791-cc203da8037b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.134925 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.134685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfp2q\" (UniqueName: \"kubernetes.io/projected/939f6ff3-0ca9-40c7-b791-cc203da8037b-kube-api-access-qfp2q\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.134925 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.134813 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939f6ff3-0ca9-40c7-b791-cc203da8037b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.235923 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.235822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939f6ff3-0ca9-40c7-b791-cc203da8037b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.235923 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.235916 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939f6ff3-0ca9-40c7-b791-cc203da8037b-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.236144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.235950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939f6ff3-0ca9-40c7-b791-cc203da8037b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.236144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.235980 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfp2q\" (UniqueName: \"kubernetes.io/projected/939f6ff3-0ca9-40c7-b791-cc203da8037b-kube-api-access-qfp2q\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.236367 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.236341 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939f6ff3-0ca9-40c7-b791-cc203da8037b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.236627 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.236597 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939f6ff3-0ca9-40c7-b791-cc203da8037b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.238626 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.238598 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939f6ff3-0ca9-40c7-b791-cc203da8037b-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.246827 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.246800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfp2q\" (UniqueName: 
\"kubernetes.io/projected/939f6ff3-0ca9-40c7-b791-cc203da8037b-kube-api-access-qfp2q\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.381054 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.381016 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:45.508414 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.508385 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k"] Apr 17 17:44:45.510802 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:44:45.510764 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939f6ff3_0ca9_40c7_b791_cc203da8037b.slice/crio-3d005cca5a9b20abd5c49b4c16ee032f07f40c33fc906f08f090284521db2035 WatchSource:0}: Error finding container 3d005cca5a9b20abd5c49b4c16ee032f07f40c33fc906f08f090284521db2035: Status 404 returned error can't find the container with id 3d005cca5a9b20abd5c49b4c16ee032f07f40c33fc906f08f090284521db2035 Apr 17 17:44:45.540570 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.540545 2566 generic.go:358] "Generic (PLEG): container finished" podID="5905bdf9-065a-4899-973f-07fc299537b8" containerID="e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4" exitCode=2 Apr 17 17:44:45.540698 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.540604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerDied","Data":"e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4"} Apr 17 17:44:45.541551 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:45.541531 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerStarted","Data":"3d005cca5a9b20abd5c49b4c16ee032f07f40c33fc906f08f090284521db2035"} Apr 17 17:44:46.190696 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.190672 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:44:46.243878 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.243849 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwczd\" (UniqueName: \"kubernetes.io/projected/5905bdf9-065a-4899-973f-07fc299537b8-kube-api-access-hwczd\") pod \"5905bdf9-065a-4899-973f-07fc299537b8\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " Apr 17 17:44:46.244096 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.243912 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") pod \"5905bdf9-065a-4899-973f-07fc299537b8\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " Apr 17 17:44:46.244096 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.243938 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5905bdf9-065a-4899-973f-07fc299537b8-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"5905bdf9-065a-4899-973f-07fc299537b8\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " Apr 17 17:44:46.244096 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.244024 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5905bdf9-065a-4899-973f-07fc299537b8-kserve-provision-location\") pod \"5905bdf9-065a-4899-973f-07fc299537b8\" (UID: \"5905bdf9-065a-4899-973f-07fc299537b8\") " Apr 17 17:44:46.244359 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.244338 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5905bdf9-065a-4899-973f-07fc299537b8-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "5905bdf9-065a-4899-973f-07fc299537b8" (UID: "5905bdf9-065a-4899-973f-07fc299537b8"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:46.244423 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.244355 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5905bdf9-065a-4899-973f-07fc299537b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5905bdf9-065a-4899-973f-07fc299537b8" (UID: "5905bdf9-065a-4899-973f-07fc299537b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:44:46.245927 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.245907 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5905bdf9-065a-4899-973f-07fc299537b8" (UID: "5905bdf9-065a-4899-973f-07fc299537b8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:46.246028 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.246008 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5905bdf9-065a-4899-973f-07fc299537b8-kube-api-access-hwczd" (OuterVolumeSpecName: "kube-api-access-hwczd") pod "5905bdf9-065a-4899-973f-07fc299537b8" (UID: "5905bdf9-065a-4899-973f-07fc299537b8"). InnerVolumeSpecName "kube-api-access-hwczd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:44:46.345568 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.345532 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5905bdf9-065a-4899-973f-07fc299537b8-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.345568 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.345565 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5905bdf9-065a-4899-973f-07fc299537b8-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.345753 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.345580 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5905bdf9-065a-4899-973f-07fc299537b8-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.345753 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.345594 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwczd\" (UniqueName: \"kubernetes.io/projected/5905bdf9-065a-4899-973f-07fc299537b8-kube-api-access-hwczd\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.545628 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.545598 2566 generic.go:358] "Generic (PLEG): container finished" podID="5905bdf9-065a-4899-973f-07fc299537b8" containerID="dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e" exitCode=0 Apr 17 17:44:46.545803 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.545663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerDied","Data":"dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e"} Apr 17 17:44:46.545803 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.545681 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" Apr 17 17:44:46.545803 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.545690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k" event={"ID":"5905bdf9-065a-4899-973f-07fc299537b8","Type":"ContainerDied","Data":"0cc2121fdc9aeb0ccb04a08ca0a96198514c0f5dc71a989a30e3aa7e1fdf25bd"} Apr 17 17:44:46.545803 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.545709 2566 scope.go:117] "RemoveContainer" containerID="e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4" Apr 17 17:44:46.547038 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.547014 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerStarted","Data":"64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c"} Apr 17 17:44:46.554053 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.554034 2566 scope.go:117] "RemoveContainer" containerID="dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e" Apr 17 17:44:46.560972 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.560958 2566 scope.go:117] "RemoveContainer" containerID="21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd" Apr 17 17:44:46.567460 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.567442 2566 scope.go:117] "RemoveContainer" containerID="e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4" Apr 17 17:44:46.567713 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:44:46.567695 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4\": container with ID starting with e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4 not found: ID does not exist" containerID="e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4" Apr 17 17:44:46.567756 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.567722 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4"} err="failed to get container status \"e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4\": rpc error: code = NotFound desc = could not find container \"e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4\": container with ID starting with e82a00113c74e44e6eed9a99012f84cc76db30d69dd88eee120e4223caf45ba4 not found: ID does not exist" Apr 17 17:44:46.567756 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.567739 2566 scope.go:117] "RemoveContainer" containerID="dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e" Apr 17 17:44:46.567967 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:44:46.567949 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e\": container with ID starting with dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e not found: ID does not exist" containerID="dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e" Apr 17 17:44:46.568027 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.567975 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e"} err="failed to get container status \"dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e\": rpc error: code = NotFound desc = could not find container \"dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e\": container with ID starting with dd0368703d9a01822aa773effd9587e99eb5b38614889876fc0a1d75b918799e not found: ID does not exist" Apr 17 17:44:46.568027 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.567991 2566 scope.go:117] "RemoveContainer" containerID="21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd" Apr 17 17:44:46.568188 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:44:46.568173 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd\": container with ID starting with 21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd not found: ID does not exist" containerID="21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd" Apr 17 17:44:46.568228 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.568192 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd"} err="failed to get container status \"21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd\": rpc error: code = NotFound desc = could not find container \"21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd\": container with ID starting with 21992d7029421c043ab054fc8fc95466ed99e700f50c33cb8dba8cd10fe118cd not found: ID does not exist" Apr 17 17:44:46.586378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.586355 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k"] Apr 17 17:44:46.595520 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.595499 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-n9r8k"] Apr 17 17:44:46.837907 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:46.837875 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5905bdf9-065a-4899-973f-07fc299537b8" path="/var/lib/kubelet/pods/5905bdf9-065a-4899-973f-07fc299537b8/volumes" Apr 17 17:44:48.835136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:48.835106 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:44:48.837033 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:48.837013 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:44:49.558632 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:49.558600 2566 generic.go:358] "Generic (PLEG): container finished" podID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerID="64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c" exitCode=0 Apr 17 17:44:49.558772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:49.558637 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" 
event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerDied","Data":"64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c"} Apr 17 17:44:50.563876 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:50.563843 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerStarted","Data":"c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd"} Apr 17 17:44:52.571592 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:52.571557 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerStarted","Data":"08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d"} Apr 17 17:44:52.571592 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:52.571595 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerStarted","Data":"286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783"} Apr 17 17:44:52.572115 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:52.571873 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:52.572115 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:52.571902 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:52.596449 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:52.596403 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podStartSLOduration=5.185332627 podStartE2EDuration="7.596391131s" podCreationTimestamp="2026-04-17 17:44:45 +0000 UTC" firstStartedPulling="2026-04-17 17:44:49.619553301 +0000 UTC m=+1201.289632921" lastFinishedPulling="2026-04-17 17:44:52.030611798 +0000 UTC m=+1203.700691425" observedRunningTime="2026-04-17 17:44:52.5942835 +0000 UTC m=+1204.264363141" watchObservedRunningTime="2026-04-17 17:44:52.596391131 +0000 UTC m=+1204.266470770" Apr 17 17:44:53.574506 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:53.574475 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:44:59.582563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:44:59.582535 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:45:19.583773 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:45:19.583736 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.27:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.27:8080: connect: connection refused" Apr 17 17:45:29.584377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:45:29.584349 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:45:59.585174 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:45:59.585149 2566 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:46:05.128614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.128584 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k"] Apr 17 17:46:05.129003 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.128909 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" containerID="cri-o://c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd" gracePeriod=30 Apr 17 17:46:05.129063 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.128991 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-agent" containerID="cri-o://286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783" gracePeriod=30 Apr 17 17:46:05.129141 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.129074 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" containerID="cri-o://08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d" gracePeriod=30 Apr 17 17:46:05.225903 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.225869 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb"] Apr 17 17:46:05.226370 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226353 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kserve-container" Apr 17 17:46:05.226416 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226374 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kserve-container" Apr 17 17:46:05.226416 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226387 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kube-rbac-proxy" Apr 17 17:46:05.226416 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226394 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kube-rbac-proxy" Apr 17 17:46:05.226514 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226416 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="storage-initializer" Apr 17 17:46:05.226514 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226424 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="storage-initializer" Apr 17 17:46:05.226514 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226491 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5905bdf9-065a-4899-973f-07fc299537b8" containerName="kserve-container" Apr 17 17:46:05.226514 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.226501 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5905bdf9-065a-4899-973f-07fc299537b8" 
containerName="kube-rbac-proxy" Apr 17 17:46:05.228925 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.228910 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.232650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.232629 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 17 17:46:05.232778 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.232635 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 17 17:46:05.235849 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.235825 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb"] Apr 17 17:46:05.294354 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.294325 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.294501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.294358 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.294501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.294404 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11e623a7-dfdc-44aa-a491-dacb3a4166ac-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.294501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.294457 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqr48\" (UniqueName: \"kubernetes.io/projected/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kube-api-access-hqr48\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.395069 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.394977 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.395069 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.395019 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kserve-provision-location\") pod 
\"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.395296 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:46:05.395132 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-predictor-serving-cert: secret "isvc-paddle-predictor-serving-cert" not found Apr 17 17:46:05.395296 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.395147 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11e623a7-dfdc-44aa-a491-dacb3a4166ac-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.395296 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:46:05.395193 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls podName:11e623a7-dfdc-44aa-a491-dacb3a4166ac nodeName:}" failed. No retries permitted until 2026-04-17 17:46:05.895177838 +0000 UTC m=+1277.565257462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls") pod "isvc-paddle-predictor-6b8b7cfb4b-ffllb" (UID: "11e623a7-dfdc-44aa-a491-dacb3a4166ac") : secret "isvc-paddle-predictor-serving-cert" not found Apr 17 17:46:05.395296 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.395219 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqr48\" (UniqueName: \"kubernetes.io/projected/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kube-api-access-hqr48\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.395471 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.395409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.395745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.395727 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11e623a7-dfdc-44aa-a491-dacb3a4166ac-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.403926 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.403905 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqr48\" (UniqueName: \"kubernetes.io/projected/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kube-api-access-hqr48\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.787886 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.787807 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerID="08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d" exitCode=2 Apr 17 17:46:05.787886 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.787853 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerDied","Data":"08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d"} Apr 17 17:46:05.899790 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.899744 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:05.902182 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:05.902156 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ffllb\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:06.140609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:06.140560 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:06.264327 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:06.264302 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb"] Apr 17 17:46:06.266367 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:46:06.266339 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e623a7_dfdc_44aa_a491_dacb3a4166ac.slice/crio-2fa6d6ad4f3ce2f8b73bdd8eea4d42528394fd2652975ae9b42ed61085edf71d WatchSource:0}: Error finding container 2fa6d6ad4f3ce2f8b73bdd8eea4d42528394fd2652975ae9b42ed61085edf71d: Status 404 returned error can't find the container with id 2fa6d6ad4f3ce2f8b73bdd8eea4d42528394fd2652975ae9b42ed61085edf71d Apr 17 17:46:06.792596 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:06.792554 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerStarted","Data":"1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc"} Apr 17 17:46:06.792596 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:06.792601 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerStarted","Data":"2fa6d6ad4f3ce2f8b73bdd8eea4d42528394fd2652975ae9b42ed61085edf71d"} Apr 17 17:46:07.797068 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:07.797037 2566 generic.go:358] "Generic (PLEG): container finished" podID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerID="c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd" exitCode=0 Apr 17 17:46:07.797461 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:07.797110 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" 
event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerDied","Data":"c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd"} Apr 17 17:46:09.578428 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:09.578391 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 17 17:46:09.583573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:09.583533 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.27:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.27:8080: connect: connection refused" Apr 17 17:46:10.807660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:10.807625 2566 generic.go:358] "Generic (PLEG): container finished" podID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerID="1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc" exitCode=0 Apr 17 17:46:10.807983 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:10.807694 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerDied","Data":"1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc"} Apr 17 17:46:14.577960 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:14.577914 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 17 17:46:19.578545 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:19.578502 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 17 17:46:19.579004 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:19.578657 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:46:19.583431 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:19.583393 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.27:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.27:8080: connect: connection refused" Apr 17 17:46:22.849446 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:22.849412 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerStarted","Data":"15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85"} Apr 17 17:46:22.849446 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:22.849452 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerStarted","Data":"8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21"} Apr 17 17:46:22.849879 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:22.849764 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:22.849920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:22.849893 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:22.850982 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:22.850957 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:46:22.872662 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:22.872618 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podStartSLOduration=6.385961243 podStartE2EDuration="17.872605626s" podCreationTimestamp="2026-04-17 17:46:05 +0000 UTC" firstStartedPulling="2026-04-17 17:46:10.808837848 +0000 UTC m=+1282.478917468" lastFinishedPulling="2026-04-17 17:46:22.295482229 +0000 UTC m=+1293.965561851" observedRunningTime="2026-04-17 17:46:22.870988153 +0000 UTC m=+1294.541067794" watchObservedRunningTime="2026-04-17 17:46:22.872605626 +0000 UTC m=+1294.542685269" Apr 17 17:46:23.852159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:23.852118 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:46:24.578317 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:24.578246 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 17 17:46:28.856127 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:28.856100 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:46:28.856673 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:28.856650 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:46:29.578629 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:29.578594 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 17 17:46:29.583425 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:29.583392 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.27:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.27:8080: connect: connection refused" Apr 17 17:46:29.583561 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:29.583493 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:46:34.577813 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:34.577765 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 17 17:46:35.269849 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.269828 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:46:35.355564 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.355529 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939f6ff3-0ca9-40c7-b791-cc203da8037b-proxy-tls\") pod \"939f6ff3-0ca9-40c7-b791-cc203da8037b\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " Apr 17 17:46:35.355564 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.355565 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939f6ff3-0ca9-40c7-b791-cc203da8037b-kserve-provision-location\") pod \"939f6ff3-0ca9-40c7-b791-cc203da8037b\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " Apr 17 17:46:35.355761 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.355589 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfp2q\" (UniqueName: \"kubernetes.io/projected/939f6ff3-0ca9-40c7-b791-cc203da8037b-kube-api-access-qfp2q\") pod \"939f6ff3-0ca9-40c7-b791-cc203da8037b\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " Apr 17 17:46:35.355761 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.355653 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939f6ff3-0ca9-40c7-b791-cc203da8037b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"939f6ff3-0ca9-40c7-b791-cc203da8037b\" (UID: \"939f6ff3-0ca9-40c7-b791-cc203da8037b\") " Apr 17 17:46:35.355993 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.355955 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939f6ff3-0ca9-40c7-b791-cc203da8037b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "939f6ff3-0ca9-40c7-b791-cc203da8037b" (UID: "939f6ff3-0ca9-40c7-b791-cc203da8037b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:46:35.356125 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.356021 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939f6ff3-0ca9-40c7-b791-cc203da8037b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "939f6ff3-0ca9-40c7-b791-cc203da8037b" (UID: "939f6ff3-0ca9-40c7-b791-cc203da8037b"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:46:35.357657 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.357633 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939f6ff3-0ca9-40c7-b791-cc203da8037b-kube-api-access-qfp2q" (OuterVolumeSpecName: "kube-api-access-qfp2q") pod "939f6ff3-0ca9-40c7-b791-cc203da8037b" (UID: "939f6ff3-0ca9-40c7-b791-cc203da8037b"). InnerVolumeSpecName "kube-api-access-qfp2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:46:35.357725 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.357669 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/939f6ff3-0ca9-40c7-b791-cc203da8037b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "939f6ff3-0ca9-40c7-b791-cc203da8037b" (UID: "939f6ff3-0ca9-40c7-b791-cc203da8037b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:46:35.456274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.456174 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939f6ff3-0ca9-40c7-b791-cc203da8037b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:46:35.456274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.456225 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939f6ff3-0ca9-40c7-b791-cc203da8037b-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:46:35.456274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.456237 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939f6ff3-0ca9-40c7-b791-cc203da8037b-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:46:35.456274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.456246 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfp2q\" (UniqueName: \"kubernetes.io/projected/939f6ff3-0ca9-40c7-b791-cc203da8037b-kube-api-access-qfp2q\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:46:35.888919 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.888879 2566 generic.go:358] "Generic (PLEG): container finished" podID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerID="286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783" exitCode=137 Apr 17 17:46:35.888919 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.888921 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerDied","Data":"286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783"} Apr 17 17:46:35.889414 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:46:35.888955 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" event={"ID":"939f6ff3-0ca9-40c7-b791-cc203da8037b","Type":"ContainerDied","Data":"3d005cca5a9b20abd5c49b4c16ee032f07f40c33fc906f08f090284521db2035"} Apr 17 17:46:35.889414 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.888968 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k" Apr 17 17:46:35.889414 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.888974 2566 scope.go:117] "RemoveContainer" containerID="08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d" Apr 17 17:46:35.897170 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.897151 2566 scope.go:117] "RemoveContainer" containerID="286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783" Apr 17 17:46:35.903961 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.903945 2566 scope.go:117] "RemoveContainer" containerID="c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd" Apr 17 17:46:35.910687 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.910672 2566 scope.go:117] "RemoveContainer" containerID="64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c" Apr 17 17:46:35.911004 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.910982 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k"] Apr 17 17:46:35.914227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.914207 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wr87k"] Apr 17 17:46:35.917559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.917541 2566 scope.go:117] "RemoveContainer" containerID="08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d" Apr 17 17:46:35.917802 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:46:35.917783 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d\": container with ID starting with 08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d not found: ID does not exist" containerID="08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d" Apr 17 17:46:35.917871 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.917814 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d"} err="failed to get container status \"08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d\": rpc error: code = NotFound desc = could not find container \"08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d\": container with ID starting with 08f70bdbe38b6a7de6577a6184826cc82946f94c50cd5b0f490957b8543f119d not found: ID does not exist" Apr 17 17:46:35.917871 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.917841 2566 scope.go:117] "RemoveContainer" containerID="286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783" Apr 17 17:46:35.918073 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:46:35.918058 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783\": container with ID starting with 
286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783 not found: ID does not exist" containerID="286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783" Apr 17 17:46:35.918111 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.918078 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783"} err="failed to get container status \"286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783\": rpc error: code = NotFound desc = could not find container \"286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783\": container with ID starting with 286041c69611ab759a86fb9d9195cfe476111415034e858c7cb8a4c3a5c7b783 not found: ID does not exist" Apr 17 17:46:35.918111 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.918092 2566 scope.go:117] "RemoveContainer" containerID="c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd" Apr 17 17:46:35.918344 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:46:35.918321 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd\": container with ID starting with c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd not found: ID does not exist" containerID="c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd" Apr 17 17:46:35.918438 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.918350 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd"} err="failed to get container status \"c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd\": rpc error: code = NotFound desc = could not find container \"c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd\": container with ID starting with c1f82dee20fed28f9222e6bf60fdada7dd01459f6f554cccb806ccf545f831bd not found: ID does not exist" Apr 17 17:46:35.918438 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.918368 2566 scope.go:117] "RemoveContainer" containerID="64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c" Apr 17 17:46:35.918583 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:46:35.918569 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c\": container with ID starting with 64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c not found: ID does not exist" containerID="64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c" Apr 17 17:46:35.918622 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:35.918588 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c"} err="failed to get container status \"64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c\": rpc error: code = NotFound desc = could not find container \"64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c\": container with ID starting with 64f7839821b191a91de16334c17a16aff5bd84b00ec2d8395685fc5f5a60749c not found: ID does not exist" Apr 17 17:46:36.838881 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:36.838850 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" path="/var/lib/kubelet/pods/939f6ff3-0ca9-40c7-b791-cc203da8037b/volumes" Apr 17 17:46:38.857070 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:38.857035 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:46:48.857475 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:48.857420 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:46:58.857037 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:46:58.856952 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:47:08.857220 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:08.857186 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:47:16.712517 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.712483 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb"] Apr 17 17:47:16.713036 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.712804 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" containerID="cri-o://8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21" gracePeriod=30 Apr 17 17:47:16.713036 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.712889 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kube-rbac-proxy" containerID="cri-o://15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85" gracePeriod=30 Apr 17 17:47:16.807648 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807613 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2"] Apr 17 17:47:16.807939 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807926 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="storage-initializer" Apr 17 17:47:16.807981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807941 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="storage-initializer" Apr 17 17:47:16.807981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807948 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" Apr 17 17:47:16.807981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807954 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" 
containerName="kube-rbac-proxy" Apr 17 17:47:16.807981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807961 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" Apr 17 17:47:16.807981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807967 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" Apr 17 17:47:16.808159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807984 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-agent" Apr 17 17:47:16.808159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.807990 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-agent" Apr 17 17:47:16.808159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.808035 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kube-rbac-proxy" Apr 17 17:47:16.808159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.808045 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-agent" Apr 17 17:47:16.808159 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.808052 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="939f6ff3-0ca9-40c7-b791-cc203da8037b" containerName="kserve-container" Apr 17 17:47:16.811348 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.811332 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:16.813555 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.813522 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 17 17:47:16.813664 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.813614 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 17 17:47:16.821970 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.821947 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2"] Apr 17 17:47:16.909227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.909198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb143e63-227b-43ee-8ba5-00a3b64e331c-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:16.909227 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.909238 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb143e63-227b-43ee-8ba5-00a3b64e331c-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:16.909466 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.909271 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dxc\" (UniqueName: \"kubernetes.io/projected/eb143e63-227b-43ee-8ba5-00a3b64e331c-kube-api-access-q7dxc\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:16.909466 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:16.909350 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb143e63-227b-43ee-8ba5-00a3b64e331c-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.009920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.009842 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb143e63-227b-43ee-8ba5-00a3b64e331c-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.009920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.009880 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb143e63-227b-43ee-8ba5-00a3b64e331c-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.010133 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.009925 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dxc\" (UniqueName: \"kubernetes.io/projected/eb143e63-227b-43ee-8ba5-00a3b64e331c-kube-api-access-q7dxc\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.010133 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.010070 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb143e63-227b-43ee-8ba5-00a3b64e331c-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.010304 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.010283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb143e63-227b-43ee-8ba5-00a3b64e331c-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.010606 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.010585 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/eb143e63-227b-43ee-8ba5-00a3b64e331c-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.012492 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.012475 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb143e63-227b-43ee-8ba5-00a3b64e331c-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.014589 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.014559 2566 generic.go:358] "Generic (PLEG): container finished" podID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerID="15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85" exitCode=2 Apr 17 17:47:17.014675 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.014604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerDied","Data":"15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85"} Apr 17 17:47:17.019871 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.019848 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dxc\" (UniqueName: \"kubernetes.io/projected/eb143e63-227b-43ee-8ba5-00a3b64e331c-kube-api-access-q7dxc\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.122444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.122405 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:17.248892 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:17.248864 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2"] Apr 17 17:47:17.250816 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:47:17.250788 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb143e63_227b_43ee_8ba5_00a3b64e331c.slice/crio-39b4ad75a9a424ce8a7e0a99d6f70f8c8a373555c112e26b0fb46169cb9537b5 WatchSource:0}: Error finding container 39b4ad75a9a424ce8a7e0a99d6f70f8c8a373555c112e26b0fb46169cb9537b5: Status 404 returned error can't find the container with id 39b4ad75a9a424ce8a7e0a99d6f70f8c8a373555c112e26b0fb46169cb9537b5 Apr 17 17:47:18.018797 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:18.018707 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerStarted","Data":"8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e"} Apr 17 17:47:18.018797 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:18.018743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerStarted","Data":"39b4ad75a9a424ce8a7e0a99d6f70f8c8a373555c112e26b0fb46169cb9537b5"} Apr 17 17:47:18.852722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:18.852680 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 17 17:47:18.856962 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:18.856932 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 17:47:19.462835 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.462813 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:47:19.631649 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.631613 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls\") pod \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " Apr 17 17:47:19.631834 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.631662 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqr48\" (UniqueName: \"kubernetes.io/projected/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kube-api-access-hqr48\") pod \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " Apr 17 17:47:19.631834 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.631750 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kserve-provision-location\") pod \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " Apr 17 17:47:19.631995 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.631973 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11e623a7-dfdc-44aa-a491-dacb3a4166ac-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\" (UID: \"11e623a7-dfdc-44aa-a491-dacb3a4166ac\") " Apr 17 17:47:19.632229 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.632204 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e623a7-dfdc-44aa-a491-dacb3a4166ac-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "11e623a7-dfdc-44aa-a491-dacb3a4166ac" (UID: "11e623a7-dfdc-44aa-a491-dacb3a4166ac"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:47:19.633698 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.633670 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "11e623a7-dfdc-44aa-a491-dacb3a4166ac" (UID: "11e623a7-dfdc-44aa-a491-dacb3a4166ac"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:47:19.634089 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.634066 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kube-api-access-hqr48" (OuterVolumeSpecName: "kube-api-access-hqr48") pod "11e623a7-dfdc-44aa-a491-dacb3a4166ac" (UID: "11e623a7-dfdc-44aa-a491-dacb3a4166ac"). InnerVolumeSpecName "kube-api-access-hqr48". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:47:19.642318 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.642290 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "11e623a7-dfdc-44aa-a491-dacb3a4166ac" (UID: "11e623a7-dfdc-44aa-a491-dacb3a4166ac"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:19.733415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.733366 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11e623a7-dfdc-44aa-a491-dacb3a4166ac-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:47:19.733415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.733407 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqr48\" (UniqueName: \"kubernetes.io/projected/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kube-api-access-hqr48\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:47:19.733415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.733419 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e623a7-dfdc-44aa-a491-dacb3a4166ac-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:47:19.733415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:19.733430 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11e623a7-dfdc-44aa-a491-dacb3a4166ac-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:47:20.026077 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.025997 2566 generic.go:358] "Generic (PLEG): container finished" podID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerID="8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21" exitCode=0 Apr 17 17:47:20.026077 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.026046 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerDied","Data":"8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21"} Apr 17 17:47:20.026077 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.026073 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" event={"ID":"11e623a7-dfdc-44aa-a491-dacb3a4166ac","Type":"ContainerDied","Data":"2fa6d6ad4f3ce2f8b73bdd8eea4d42528394fd2652975ae9b42ed61085edf71d"} Apr 17 17:47:20.026315 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.026080 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb" Apr 17 17:47:20.026315 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.026087 2566 scope.go:117] "RemoveContainer" containerID="15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85" Apr 17 17:47:20.034008 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.033992 2566 scope.go:117] "RemoveContainer" containerID="8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21" Apr 17 17:47:20.041229 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.041212 2566 scope.go:117] "RemoveContainer" containerID="1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc" Apr 17 17:47:20.047744 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.047722 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb"] Apr 17 17:47:20.048759 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.048742 2566 scope.go:117] "RemoveContainer" containerID="15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85" Apr 17 17:47:20.048994 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:47:20.048977 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85\": container with ID starting with 15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85 not found: ID does not exist" containerID="15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85" Apr 17 17:47:20.049108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.049006 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85"} err="failed to get container status \"15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85\": rpc error: code = NotFound desc = could not find container \"15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85\": container with ID starting with 15115e5aa6e4e0fc44353236905c77a7716629190cbfea4888721f06db99ee85 not found: ID does not exist" Apr 17 17:47:20.049108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.049030 2566 scope.go:117] "RemoveContainer" containerID="8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21" Apr 17 17:47:20.049294 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:47:20.049274 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21\": container with ID starting with 8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21 not found: ID does not exist" containerID="8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21" Apr 17 17:47:20.049341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.049301 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21"} err="failed to get container status \"8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21\": rpc error: code = NotFound desc = could not find container \"8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21\": container with ID starting with 8227c5a9d8e5888f7b382c2c67d8dfd081b882b82c0833d3b939acd73a526f21 not found: ID does not exist" Apr 17 17:47:20.049341 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.049315 
2566 scope.go:117] "RemoveContainer" containerID="1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc" Apr 17 17:47:20.049559 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:47:20.049540 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc\": container with ID starting with 1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc not found: ID does not exist" containerID="1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc" Apr 17 17:47:20.049609 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.049566 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc"} err="failed to get container status \"1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc\": rpc error: code = NotFound desc = could not find container \"1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc\": container with ID starting with 1471aff1750b4f6683a3f000c5abb56cdbd17a01e557c737670a1b4051efa9dc not found: ID does not exist" Apr 17 17:47:20.054060 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.054035 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ffllb"] Apr 17 17:47:20.838531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:20.838497 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" path="/var/lib/kubelet/pods/11e623a7-dfdc-44aa-a491-dacb3a4166ac/volumes" Apr 17 17:47:22.034665 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:22.034584 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerID="8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e" exitCode=0 Apr 17 17:47:22.035082 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:22.034661 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerDied","Data":"8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e"} Apr 17 17:47:23.039745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:23.039710 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerStarted","Data":"e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8"} Apr 17 17:47:23.040216 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:23.039755 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerStarted","Data":"565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462"} Apr 17 17:47:23.040216 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:23.040068 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:23.061494 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:23.061452 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podStartSLOduration=7.061439385 
podStartE2EDuration="7.061439385s" podCreationTimestamp="2026-04-17 17:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:23.060043214 +0000 UTC m=+1354.730122856" watchObservedRunningTime="2026-04-17 17:47:23.061439385 +0000 UTC m=+1354.731519026" Apr 17 17:47:24.043241 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:24.043208 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:24.044362 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:24.044335 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:47:25.046623 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:25.046585 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:47:30.051133 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:30.051102 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:47:30.051745 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:30.051718 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:47:40.052420 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:40.052379 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:47:50.052462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:47:50.052422 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:48:00.051855 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:00.051809 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:48:10.052722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:10.052692 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:48:18.288104 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.288069 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2"] Apr 17 17:48:18.288578 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:48:18.288514 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" containerID="cri-o://565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462" gracePeriod=30 Apr 17 17:48:18.288647 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.288574 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kube-rbac-proxy" containerID="cri-o://e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8" gracePeriod=30 Apr 17 17:48:18.390380 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390345 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd"] Apr 17 17:48:18.390704 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390692 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" Apr 17 17:48:18.390751 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390706 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" Apr 17 17:48:18.390751 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390724 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kube-rbac-proxy" Apr 17 17:48:18.390751 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390731 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kube-rbac-proxy" Apr 17 17:48:18.390751 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390744 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="storage-initializer" Apr 17 17:48:18.390751 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390750 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="storage-initializer" Apr 17 17:48:18.390905 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390805 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kube-rbac-proxy" Apr 17 17:48:18.390905 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.390818 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="11e623a7-dfdc-44aa-a491-dacb3a4166ac" containerName="kserve-container" Apr 17 17:48:18.394091 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.394069 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.396417 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.396396 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 17 17:48:18.396601 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.396581 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 17 17:48:18.402706 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.402681 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd"] Apr 17 17:48:18.421110 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.421072 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.421235 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.421121 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.421235 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.421180 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.421235 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.421210 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ssc\" (UniqueName: \"kubernetes.io/projected/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kube-api-access-48ssc\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.522241 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.522197 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48ssc\" (UniqueName: \"kubernetes.io/projected/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kube-api-access-48ssc\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.522434 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.522326 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls\") pod 
\"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.522434 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.522372 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.522523 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.522440 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.522523 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:48:18.522493 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-serving-cert: secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 17 17:48:18.522588 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:48:18.522556 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls podName:cbf70011-f8e8-4afe-b1b4-2ed56bfe4528 nodeName:}" failed. No retries permitted until 2026-04-17 17:48:19.022539996 +0000 UTC m=+1410.692619619 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls") pod "isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" (UID: "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528") : secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 17 17:48:18.522793 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.522776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.523062 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.523043 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:18.533590 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:18.533554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ssc\" (UniqueName: \"kubernetes.io/projected/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kube-api-access-48ssc\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:19.026660 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.026620 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:19.029005 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.028982 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:19.211093 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.211059 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerID="e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8" exitCode=2 Apr 17 17:48:19.211093 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.211097 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerDied","Data":"e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8"} Apr 17 17:48:19.306651 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.306571 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:19.430176 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.430033 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd"] Apr 17 17:48:19.432933 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:48:19.432902 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf70011_f8e8_4afe_b1b4_2ed56bfe4528.slice/crio-4b5727aef296f181b6512387463905c98aeb6865798370a06650be45ae21a5a9 WatchSource:0}: Error finding container 4b5727aef296f181b6512387463905c98aeb6865798370a06650be45ae21a5a9: Status 404 returned error can't find the container with id 4b5727aef296f181b6512387463905c98aeb6865798370a06650be45ae21a5a9 Apr 17 17:48:19.434763 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:19.434747 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:48:20.047347 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:20.047304 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 17 17:48:20.052425 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:20.052396 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 17:48:20.215606 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:20.215567 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerStarted","Data":"22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a"} Apr 17 17:48:20.215606 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:20.215605 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerStarted","Data":"4b5727aef296f181b6512387463905c98aeb6865798370a06650be45ae21a5a9"} Apr 17 17:48:21.027153 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.027129 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:48:21.046971 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.046945 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dxc\" (UniqueName: \"kubernetes.io/projected/eb143e63-227b-43ee-8ba5-00a3b64e331c-kube-api-access-q7dxc\") pod \"eb143e63-227b-43ee-8ba5-00a3b64e331c\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " Apr 17 17:48:21.047149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.046990 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb143e63-227b-43ee-8ba5-00a3b64e331c-proxy-tls\") pod \"eb143e63-227b-43ee-8ba5-00a3b64e331c\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " Apr 17 17:48:21.047149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.047024 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb143e63-227b-43ee-8ba5-00a3b64e331c-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"eb143e63-227b-43ee-8ba5-00a3b64e331c\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " Apr 17 17:48:21.047149 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.047050 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb143e63-227b-43ee-8ba5-00a3b64e331c-kserve-provision-location\") pod \"eb143e63-227b-43ee-8ba5-00a3b64e331c\" (UID: \"eb143e63-227b-43ee-8ba5-00a3b64e331c\") " Apr 17 17:48:21.047482 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.047453 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb143e63-227b-43ee-8ba5-00a3b64e331c-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "eb143e63-227b-43ee-8ba5-00a3b64e331c" (UID: "eb143e63-227b-43ee-8ba5-00a3b64e331c"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:48:21.050171 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.049547 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb143e63-227b-43ee-8ba5-00a3b64e331c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eb143e63-227b-43ee-8ba5-00a3b64e331c" (UID: "eb143e63-227b-43ee-8ba5-00a3b64e331c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:48:21.050408 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.050379 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb143e63-227b-43ee-8ba5-00a3b64e331c-kube-api-access-q7dxc" (OuterVolumeSpecName: "kube-api-access-q7dxc") pod "eb143e63-227b-43ee-8ba5-00a3b64e331c" (UID: "eb143e63-227b-43ee-8ba5-00a3b64e331c"). InnerVolumeSpecName "kube-api-access-q7dxc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:48:21.059766 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.059736 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb143e63-227b-43ee-8ba5-00a3b64e331c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eb143e63-227b-43ee-8ba5-00a3b64e331c" (UID: "eb143e63-227b-43ee-8ba5-00a3b64e331c"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:48:21.148009 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.147927 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb143e63-227b-43ee-8ba5-00a3b64e331c-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:48:21.148009 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.147954 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7dxc\" (UniqueName: \"kubernetes.io/projected/eb143e63-227b-43ee-8ba5-00a3b64e331c-kube-api-access-q7dxc\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:48:21.148009 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.147989 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb143e63-227b-43ee-8ba5-00a3b64e331c-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:48:21.148009 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.147999 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb143e63-227b-43ee-8ba5-00a3b64e331c-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:48:21.219799 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.219763 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerID="565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462" exitCode=0 Apr 17 17:48:21.219967 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.219851 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" Apr 17 17:48:21.219967 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.219844 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerDied","Data":"565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462"} Apr 17 17:48:21.219967 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.219960 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2" event={"ID":"eb143e63-227b-43ee-8ba5-00a3b64e331c","Type":"ContainerDied","Data":"39b4ad75a9a424ce8a7e0a99d6f70f8c8a373555c112e26b0fb46169cb9537b5"} Apr 17 17:48:21.220087 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.219976 2566 scope.go:117] "RemoveContainer" containerID="e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8" Apr 17 17:48:21.228784 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.228769 2566 scope.go:117] "RemoveContainer" containerID="565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462" Apr 17 17:48:21.235973 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.235956 2566 scope.go:117] "RemoveContainer" containerID="8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e" Apr 17 17:48:21.241200 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.241177 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2"] Apr 17 17:48:21.243302 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.243235 2566 scope.go:117] "RemoveContainer" containerID="e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8" Apr 17 17:48:21.243587 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:48:21.243565 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8\": container with ID starting with e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8 not found: ID does not exist" containerID="e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8" Apr 17 17:48:21.243682 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.243593 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8"} err="failed to get container status \"e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8\": rpc error: code = NotFound desc = could not find container \"e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8\": container with ID starting with e9debc71db8a04c46aac651204c4e17b0259b9258d9782ccaed5bd2c4e970ec8 not found: ID does not exist" Apr 17 17:48:21.243682 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.243615 2566 scope.go:117] "RemoveContainer" containerID="565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462" Apr 17 17:48:21.243874 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:48:21.243856 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462\": container with ID starting with 565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462 not found: ID does not exist" 
containerID="565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462" Apr 17 17:48:21.243929 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.243878 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462"} err="failed to get container status \"565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462\": rpc error: code = NotFound desc = could not find container \"565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462\": container with ID starting with 565166dfe8b8aff8395327403c3ef5dc9ad76ad8734e8a00baa9728126fb9462 not found: ID does not exist" Apr 17 17:48:21.243929 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.243892 2566 scope.go:117] "RemoveContainer" containerID="8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e" Apr 17 17:48:21.244154 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:48:21.244129 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e\": container with ID starting with 8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e not found: ID does not exist" containerID="8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e" Apr 17 17:48:21.244202 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.244158 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e"} err="failed to get container status \"8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e\": rpc error: code = NotFound desc = could not find container \"8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e\": container with ID starting with 8dff3b4bbfa550f6bc9e67679691a438d332bb31e4ef6d3aaf046a3dcc793c1e not found: ID does not exist" Apr 17 17:48:21.244719 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:21.244702 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hrz2"] Apr 17 17:48:22.838114 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:22.838073 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" path="/var/lib/kubelet/pods/eb143e63-227b-43ee-8ba5-00a3b64e331c/volumes" Apr 17 17:48:24.230445 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:24.230414 2566 generic.go:358] "Generic (PLEG): container finished" podID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerID="22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a" exitCode=0 Apr 17 17:48:24.230830 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:24.230487 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerDied","Data":"22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a"} Apr 17 17:48:25.235179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:25.235145 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerStarted","Data":"e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb"} Apr 17 17:48:25.235179 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:25.235185 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerStarted","Data":"874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac"} Apr 17 17:48:25.235684 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:25.235510 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:25.235684 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:25.235544 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:25.237115 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:25.237092 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:48:25.255882 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:25.255832 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podStartSLOduration=7.2558202210000005 podStartE2EDuration="7.255820221s" podCreationTimestamp="2026-04-17 17:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:48:25.255042484 +0000 UTC m=+1416.925122126" watchObservedRunningTime="2026-04-17 17:48:25.255820221 +0000 UTC m=+1416.925899862" Apr 17 17:48:26.238377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:26.238291 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:48:31.243790 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:31.243762 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:48:31.244443 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:31.244410 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:48:41.244661 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:41.244625 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:48:51.244729 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:48:51.244687 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:49:01.245316 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:49:01.245269 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:49:11.245576 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:11.245539 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:49:20.092054 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.092017 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd"] Apr 17 17:49:20.092555 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.092399 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" containerID="cri-o://874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac" gracePeriod=30 Apr 17 17:49:20.092555 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.092400 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kube-rbac-proxy" containerID="cri-o://e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb" gracePeriod=30 Apr 17 17:49:20.180493 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180457 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k"] Apr 17 17:49:20.180880 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180861 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="storage-initializer" Apr 17 17:49:20.180880 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180882 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="storage-initializer" Apr 17 17:49:20.181045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180896 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" Apr 17 17:49:20.181045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180904 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kserve-container" Apr 17 17:49:20.181045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180914 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kube-rbac-proxy" Apr 17 17:49:20.181045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.180923 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kube-rbac-proxy" Apr 17 17:49:20.181045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.181017 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" containerName="kube-rbac-proxy" Apr 17 17:49:20.181045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.181031 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb143e63-227b-43ee-8ba5-00a3b64e331c" 
containerName="kserve-container" Apr 17 17:49:20.183127 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.183106 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.185361 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.185342 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 17 17:49:20.185461 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.185425 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 17 17:49:20.195365 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.195342 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k"] Apr 17 17:49:20.239132 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.239105 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.239297 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.239176 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.239297 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.239239 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcvf\" (UniqueName: \"kubernetes.io/projected/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kube-api-access-jpcvf\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.239404 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.239332 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.340495 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.340451 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.340693 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.340510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpcvf\" (UniqueName: \"kubernetes.io/projected/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kube-api-access-jpcvf\") pod \"isvc-pmml-predictor-8bb578669-d959k\" 
(UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.340693 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.340553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.340693 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.340578 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.340856 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:49:20.340693 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 17 17:49:20.340856 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:49:20.340756 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls podName:d8aec2cc-326b-429a-93a3-d1fec3bc6cfb nodeName:}" failed. No retries permitted until 2026-04-17 17:49:20.840735177 +0000 UTC m=+1472.510814801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls") pod "isvc-pmml-predictor-8bb578669-d959k" (UID: "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb") : secret "isvc-pmml-predictor-serving-cert" not found Apr 17 17:49:20.340951 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.340885 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.341312 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.341289 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.350214 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.350152 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpcvf\" (UniqueName: \"kubernetes.io/projected/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kube-api-access-jpcvf\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.391559 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.391519 2566 generic.go:358] "Generic (PLEG): container finished" podID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerID="e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb" exitCode=2 Apr 17 17:49:20.391697 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.391593 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerDied","Data":"e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb"} Apr 17 17:49:20.845476 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.845445 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:20.847786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:20.847768 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-d959k\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:21.092370 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:21.092335 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:21.208849 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:21.208811 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k"] Apr 17 17:49:21.212878 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:49:21.212839 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8aec2cc_326b_429a_93a3_d1fec3bc6cfb.slice/crio-2484e861b56b8b8949ecc127b403e35496eb4fb897c4c6188e62f3a727903d34 WatchSource:0}: Error finding container 2484e861b56b8b8949ecc127b403e35496eb4fb897c4c6188e62f3a727903d34: Status 404 returned error can't find the container with id 2484e861b56b8b8949ecc127b403e35496eb4fb897c4c6188e62f3a727903d34 Apr 17 17:49:21.239419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:21.239388 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 17 17:49:21.244841 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:21.244815 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 17:49:21.396028 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:21.395933 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerStarted","Data":"9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5"} Apr 17 17:49:21.396028 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:21.395973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" 
event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerStarted","Data":"2484e861b56b8b8949ecc127b403e35496eb4fb897c4c6188e62f3a727903d34"} Apr 17 17:49:22.838701 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.838679 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:49:22.863938 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.863913 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kserve-provision-location\") pod \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " Apr 17 17:49:22.864064 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.863955 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " Apr 17 17:49:22.864064 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.863995 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ssc\" (UniqueName: \"kubernetes.io/projected/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kube-api-access-48ssc\") pod \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " Apr 17 17:49:22.864064 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.864024 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls\") pod \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\" (UID: \"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528\") " Apr 17 17:49:22.864523 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.864392 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" (UID: "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:49:22.866131 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.866092 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kube-api-access-48ssc" (OuterVolumeSpecName: "kube-api-access-48ssc") pod "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" (UID: "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528"). InnerVolumeSpecName "kube-api-access-48ssc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:49:22.866505 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.866488 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" (UID: "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:49:22.877031 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.877007 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" (UID: "cbf70011-f8e8-4afe-b1b4-2ed56bfe4528"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:49:22.964696 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.964617 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:49:22.964696 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.964646 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:49:22.964696 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.964657 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-48ssc\" (UniqueName: \"kubernetes.io/projected/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-kube-api-access-48ssc\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:49:22.964696 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:22.964668 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:49:23.402668 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.402631 2566 generic.go:358] "Generic (PLEG): container finished" podID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerID="874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac" exitCode=0 Apr 17 17:49:23.402844 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.402682 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerDied","Data":"874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac"} Apr 17 17:49:23.402844 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.402730 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" Apr 17 17:49:23.402844 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.402734 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd" event={"ID":"cbf70011-f8e8-4afe-b1b4-2ed56bfe4528","Type":"ContainerDied","Data":"4b5727aef296f181b6512387463905c98aeb6865798370a06650be45ae21a5a9"} Apr 17 17:49:23.402844 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.402756 2566 scope.go:117] "RemoveContainer" containerID="e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb" Apr 17 17:49:23.411039 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.411020 2566 scope.go:117] "RemoveContainer" containerID="874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac" Apr 17 17:49:23.417666 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.417648 2566 scope.go:117] "RemoveContainer" containerID="22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a" Apr 17 17:49:23.424102 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.424081 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd"] Apr 17 17:49:23.424798 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.424768 2566 scope.go:117] "RemoveContainer" containerID="e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb" Apr 17 17:49:23.425066 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:49:23.425048 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb\": container with ID starting with e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb not found: ID does not exist" containerID="e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb" Apr 17 17:49:23.425136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.425073 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb"} err="failed to get container status \"e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb\": rpc error: code = NotFound desc = could not find container \"e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb\": container with ID starting with e9287caec549835445bc544c4a6e2bc12e7ef6c3a1d19286e7107f79e5089fbb not found: ID does not exist" Apr 17 17:49:23.425136 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.425089 2566 scope.go:117] "RemoveContainer" containerID="874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac" Apr 17 17:49:23.425325 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:49:23.425310 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac\": container with ID starting with 874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac not found: ID does not exist" containerID="874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac" Apr 17 17:49:23.425383 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.425328 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac"} err="failed to get container status 
\"874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac\": rpc error: code = NotFound desc = could not find container \"874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac\": container with ID starting with 874f9d87a6fdc76104f04d20d1f5bfdeadd2d4fe1e04cc2ac3f45ab32749beac not found: ID does not exist" Apr 17 17:49:23.425383 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.425341 2566 scope.go:117] "RemoveContainer" containerID="22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a" Apr 17 17:49:23.425534 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:49:23.425520 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a\": container with ID starting with 22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a not found: ID does not exist" containerID="22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a" Apr 17 17:49:23.425573 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.425537 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a"} err="failed to get container status \"22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a\": rpc error: code = NotFound desc = could not find container \"22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a\": container with ID starting with 22898d1f8daf950c25553b71bec734dae6e3f6a0602aba9552355981d01ef20a not found: ID does not exist" Apr 17 17:49:23.428484 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:23.428464 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-8x2nd"] Apr 17 17:49:24.837298 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:24.837249 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" path="/var/lib/kubelet/pods/cbf70011-f8e8-4afe-b1b4-2ed56bfe4528/volumes" Apr 17 17:49:25.410518 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:25.410482 2566 generic.go:358] "Generic (PLEG): container finished" podID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerID="9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5" exitCode=0 Apr 17 17:49:25.410698 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:25.410556 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerDied","Data":"9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5"} Apr 17 17:49:33.440966 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:33.440930 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerStarted","Data":"70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0"} Apr 17 17:49:33.440966 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:33.440974 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerStarted","Data":"b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c"} Apr 17 17:49:33.441525 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:33.441194 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:33.464211 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:33.464160 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podStartSLOduration=6.455106548 podStartE2EDuration="13.464145155s" podCreationTimestamp="2026-04-17 17:49:20 +0000 UTC" firstStartedPulling="2026-04-17 17:49:25.411697947 +0000 UTC m=+1477.081777568" lastFinishedPulling="2026-04-17 17:49:32.420736552 +0000 UTC m=+1484.090816175" observedRunningTime="2026-04-17 17:49:33.46220574 +0000 UTC m=+1485.132285381" watchObservedRunningTime="2026-04-17 17:49:33.464145155 +0000 UTC m=+1485.134224796" Apr 17 17:49:34.443635 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:34.443601 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:34.444842 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:34.444808 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:49:35.446654 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:35.446610 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:49:40.452008 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:40.451976 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:49:40.452511 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:40.452485 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:49:48.856939 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:48.856912 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:49:48.858612 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:48.858591 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:49:50.452645 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:49:50.452608 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:50:00.453144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:50:00.453055 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:50:10.452500 
ip-10-0-140-147 kubenswrapper[2566]: I0417 17:50:10.452455 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:50:20.453138 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:50:20.453098 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:50:30.453467 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:50:30.453424 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:50:40.452624 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:50:40.452583 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:50:50.453106 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:50:50.453076 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:51:01.609068 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.609036 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k"] Apr 17 17:51:01.609566 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.609388 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" containerID="cri-o://b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c" gracePeriod=30 Apr 17 17:51:01.609566 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.609449 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kube-rbac-proxy" containerID="cri-o://70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0" gracePeriod=30 Apr 17 17:51:01.712777 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.712742 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t"] Apr 17 17:51:01.713150 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713122 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kube-rbac-proxy" Apr 17 17:51:01.713150 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713142 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kube-rbac-proxy" Apr 17 17:51:01.713350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713161 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" Apr 17 
17:51:01.713350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713170 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" Apr 17 17:51:01.713350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713186 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="storage-initializer" Apr 17 17:51:01.713350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713196 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="storage-initializer" Apr 17 17:51:01.713350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713295 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kserve-container" Apr 17 17:51:01.713350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.713313 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbf70011-f8e8-4afe-b1b4-2ed56bfe4528" containerName="kube-rbac-proxy" Apr 17 17:51:01.715643 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.715623 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.717882 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.717862 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 17 17:51:01.718038 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.718018 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 17 17:51:01.726725 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.726701 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t"] Apr 17 17:51:01.809075 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.809043 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/135f8205-c96c-4744-96fa-566f9008229d-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.809075 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.809075 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shzwk\" (UniqueName: \"kubernetes.io/projected/135f8205-c96c-4744-96fa-566f9008229d-kube-api-access-shzwk\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.809378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.809161 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/135f8205-c96c-4744-96fa-566f9008229d-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.809378 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.809280 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/135f8205-c96c-4744-96fa-566f9008229d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.910640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.910528 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/135f8205-c96c-4744-96fa-566f9008229d-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.910640 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.910618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/135f8205-c96c-4744-96fa-566f9008229d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.910885 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.910666 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/135f8205-c96c-4744-96fa-566f9008229d-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.910885 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.910775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shzwk\" (UniqueName: \"kubernetes.io/projected/135f8205-c96c-4744-96fa-566f9008229d-kube-api-access-shzwk\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.911001 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.910946 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/135f8205-c96c-4744-96fa-566f9008229d-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.911336 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.911314 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/135f8205-c96c-4744-96fa-566f9008229d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.913232 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.913205 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/135f8205-c96c-4744-96fa-566f9008229d-proxy-tls\") pod 
\"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:01.919574 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:01.919548 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shzwk\" (UniqueName: \"kubernetes.io/projected/135f8205-c96c-4744-96fa-566f9008229d-kube-api-access-shzwk\") pod \"isvc-pmml-runtime-predictor-67bc544947-l747t\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:02.026722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:02.026679 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:02.144482 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:02.144453 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t"] Apr 17 17:51:02.146841 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:51:02.146812 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135f8205_c96c_4744_96fa_566f9008229d.slice/crio-10774f5c7711366557f84d46684ba1d98e2c2b4fb1a74002f56c93eb77be64b6 WatchSource:0}: Error finding container 10774f5c7711366557f84d46684ba1d98e2c2b4fb1a74002f56c93eb77be64b6: Status 404 returned error can't find the container with id 10774f5c7711366557f84d46684ba1d98e2c2b4fb1a74002f56c93eb77be64b6 Apr 17 17:51:02.708890 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:02.708851 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerStarted","Data":"f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95"} Apr 17 17:51:02.708890 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:02.708891 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerStarted","Data":"10774f5c7711366557f84d46684ba1d98e2c2b4fb1a74002f56c93eb77be64b6"} Apr 17 17:51:02.710731 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:02.710704 2566 generic.go:358] "Generic (PLEG): container finished" podID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerID="70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0" exitCode=2 Apr 17 17:51:02.710838 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:02.710770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerDied","Data":"70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0"} Apr 17 17:51:05.245527 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.245505 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:51:05.342045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.342022 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kserve-provision-location\") pod \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " Apr 17 17:51:05.342231 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.342064 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpcvf\" (UniqueName: \"kubernetes.io/projected/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kube-api-access-jpcvf\") pod \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " Apr 17 17:51:05.342231 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.342107 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls\") pod \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " Apr 17 17:51:05.342231 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.342151 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\" (UID: \"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb\") " Apr 17 17:51:05.342483 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.342430 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" (UID: "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:05.342626 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.342599 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" (UID: "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:51:05.344205 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.344180 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kube-api-access-jpcvf" (OuterVolumeSpecName: "kube-api-access-jpcvf") pod "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" (UID: "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb"). InnerVolumeSpecName "kube-api-access-jpcvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:51:05.344304 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.344198 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" (UID: "d8aec2cc-326b-429a-93a3-d1fec3bc6cfb"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:51:05.442853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.442817 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jpcvf\" (UniqueName: \"kubernetes.io/projected/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kube-api-access-jpcvf\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:51:05.442853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.442848 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:51:05.442853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.442863 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:51:05.443087 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.442874 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:51:05.722108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.722029 2566 generic.go:358] "Generic (PLEG): container finished" podID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerID="b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c" exitCode=0 Apr 17 17:51:05.722108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.722089 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerDied","Data":"b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c"} Apr 17 17:51:05.722108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.722103 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" Apr 17 17:51:05.722415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.722115 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k" event={"ID":"d8aec2cc-326b-429a-93a3-d1fec3bc6cfb","Type":"ContainerDied","Data":"2484e861b56b8b8949ecc127b403e35496eb4fb897c4c6188e62f3a727903d34"} Apr 17 17:51:05.722415 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.722131 2566 scope.go:117] "RemoveContainer" containerID="70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0" Apr 17 17:51:05.730410 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.730209 2566 scope.go:117] "RemoveContainer" containerID="b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c" Apr 17 17:51:05.737177 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.737158 2566 scope.go:117] "RemoveContainer" containerID="9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5" Apr 17 17:51:05.745001 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.744919 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k"] Apr 17 17:51:05.745075 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.745023 2566 scope.go:117] "RemoveContainer" containerID="70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0" Apr 17 17:51:05.745295 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:51:05.745275 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0\": container with ID starting with 70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0 not found: ID does not exist" containerID="70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0" Apr 17 17:51:05.745402 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.745302 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0"} err="failed to get container status \"70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0\": rpc error: code = NotFound desc = could not find container \"70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0\": container with ID starting with 70243c28e83410a3989c9f95d312edb78d620758c0a990e5a5aed49a47547af0 not found: ID does not exist" Apr 17 17:51:05.745402 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.745324 2566 scope.go:117] "RemoveContainer" containerID="b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c" Apr 17 17:51:05.745573 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:51:05.745555 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c\": container with ID starting with b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c not found: ID does not exist" containerID="b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c" Apr 17 17:51:05.745614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.745578 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c"} err="failed to get container status 
\"b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c\": rpc error: code = NotFound desc = could not find container \"b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c\": container with ID starting with b83c224ca2564b28e1d07626e9a345caa6c80a5d15be2c61b8da45744b8a764c not found: ID does not exist" Apr 17 17:51:05.745614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.745595 2566 scope.go:117] "RemoveContainer" containerID="9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5" Apr 17 17:51:05.745894 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:51:05.745876 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5\": container with ID starting with 9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5 not found: ID does not exist" containerID="9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5" Apr 17 17:51:05.745955 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.745900 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5"} err="failed to get container status \"9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5\": rpc error: code = NotFound desc = could not find container \"9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5\": container with ID starting with 9430e1da60af15d09be19e9b0bdb5c43ccf6428fd74063717dbec7f928a0dff5 not found: ID does not exist" Apr 17 17:51:05.747924 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:05.747904 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-d959k"] Apr 17 17:51:06.726960 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:06.726928 2566 generic.go:358] "Generic (PLEG): container finished" podID="135f8205-c96c-4744-96fa-566f9008229d" containerID="f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95" exitCode=0 Apr 17 17:51:06.727411 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:06.727002 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerDied","Data":"f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95"} Apr 17 17:51:06.838172 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:06.838135 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" path="/var/lib/kubelet/pods/d8aec2cc-326b-429a-93a3-d1fec3bc6cfb/volumes" Apr 17 17:51:07.732614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:07.732581 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerStarted","Data":"ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1"} Apr 17 17:51:07.732614 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:07.732621 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerStarted","Data":"9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa"} Apr 17 17:51:07.732999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:07.732835 2566 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:07.753191 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:07.753147 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podStartSLOduration=6.753134548 podStartE2EDuration="6.753134548s" podCreationTimestamp="2026-04-17 17:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:51:07.751781188 +0000 UTC m=+1579.421860841" watchObservedRunningTime="2026-04-17 17:51:07.753134548 +0000 UTC m=+1579.423214190" Apr 17 17:51:08.736355 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:08.736320 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:08.737702 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:08.737678 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:51:09.739321 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:09.739271 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:51:14.744880 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:14.744848 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:51:14.745445 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:14.745420 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:51:24.746337 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:24.746291 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:51:34.745563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:34.745518 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:51:44.745484 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:44.745440 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:51:54.746302 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:51:54.746239 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:52:04.745320 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:04.745282 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:52:14.746123 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:14.746081 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:52:24.746373 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:24.746337 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:52:32.917108 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:32.917070 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t"] Apr 17 17:52:32.917533 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:32.917466 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" containerID="cri-o://9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa" gracePeriod=30 Apr 17 17:52:32.917602 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:32.917513 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kube-rbac-proxy" containerID="cri-o://ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1" gracePeriod=30 Apr 17 17:52:33.010703 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.010665 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz"] Apr 17 17:52:33.010974 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.010961 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="storage-initializer" Apr 17 17:52:33.010974 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.010975 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="storage-initializer" Apr 17 17:52:33.011072 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.010988 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" Apr 17 17:52:33.011072 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.010994 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" Apr 17 17:52:33.011072 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.011007 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kube-rbac-proxy" Apr 17 17:52:33.011072 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.011013 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kube-rbac-proxy" Apr 17 17:52:33.011072 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.011067 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kserve-container" Apr 17 17:52:33.011226 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.011076 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8aec2cc-326b-429a-93a3-d1fec3bc6cfb" containerName="kube-rbac-proxy" Apr 17 17:52:33.014234 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.014220 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.016598 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.016569 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 17 17:52:33.016693 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.016579 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 17 17:52:33.024574 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.024537 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz"] Apr 17 17:52:33.086011 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.085981 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.086193 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.086044 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.086193 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.086073 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.086193 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.086097 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4rv\" (UniqueName: \"kubernetes.io/projected/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kube-api-access-gd4rv\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.186619 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.186513 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4rv\" (UniqueName: \"kubernetes.io/projected/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kube-api-access-gd4rv\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.186619 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.186561 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.186873 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.186681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.186873 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.186713 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.186873 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:52:33.186827 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 17 17:52:33.187047 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:52:33.186891 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls podName:3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8 nodeName:}" failed. No retries permitted until 2026-04-17 17:52:33.686874145 +0000 UTC m=+1665.356953766 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" (UID: "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 17 17:52:33.187047 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.187019 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.187278 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.187239 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.197383 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.197360 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4rv\" (UniqueName: \"kubernetes.io/projected/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kube-api-access-gd4rv\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.691453 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.691412 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.693797 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.693774 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.925305 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.925269 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:33.983485 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.983399 2566 generic.go:358] "Generic (PLEG): container finished" podID="135f8205-c96c-4744-96fa-566f9008229d" containerID="ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1" exitCode=2 Apr 17 17:52:33.983485 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:33.983450 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerDied","Data":"ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1"} Apr 17 17:52:34.049942 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:34.049915 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz"] Apr 17 17:52:34.052369 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:52:34.052341 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe31fe9_e664_4bf1_aba1_3f26c2ceccd8.slice/crio-a67b365193210cc10ecee3aea28924a3c56a259f5b1714ebc819c084a891ee17 WatchSource:0}: Error finding container a67b365193210cc10ecee3aea28924a3c56a259f5b1714ebc819c084a891ee17: Status 404 returned error can't find the container with id a67b365193210cc10ecee3aea28924a3c56a259f5b1714ebc819c084a891ee17 Apr 17 17:52:34.739489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:34.739453 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 17 17:52:34.745939 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:34.745907 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:52:34.987504 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:34.987464 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerStarted","Data":"e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca"} Apr 17 17:52:34.987504 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:34.987500 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerStarted","Data":"a67b365193210cc10ecee3aea28924a3c56a259f5b1714ebc819c084a891ee17"} Apr 17 17:52:36.564750 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.564724 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:52:36.618340 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.618239 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shzwk\" (UniqueName: \"kubernetes.io/projected/135f8205-c96c-4744-96fa-566f9008229d-kube-api-access-shzwk\") pod \"135f8205-c96c-4744-96fa-566f9008229d\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " Apr 17 17:52:36.618340 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.618291 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/135f8205-c96c-4744-96fa-566f9008229d-proxy-tls\") pod \"135f8205-c96c-4744-96fa-566f9008229d\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " Apr 17 17:52:36.618340 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.618330 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/135f8205-c96c-4744-96fa-566f9008229d-kserve-provision-location\") pod \"135f8205-c96c-4744-96fa-566f9008229d\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " Apr 17 17:52:36.618624 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.618352 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/135f8205-c96c-4744-96fa-566f9008229d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"135f8205-c96c-4744-96fa-566f9008229d\" (UID: \"135f8205-c96c-4744-96fa-566f9008229d\") " Apr 17 17:52:36.618699 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.618671 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/135f8205-c96c-4744-96fa-566f9008229d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "135f8205-c96c-4744-96fa-566f9008229d" (UID: "135f8205-c96c-4744-96fa-566f9008229d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:52:36.618757 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.618739 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135f8205-c96c-4744-96fa-566f9008229d-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "135f8205-c96c-4744-96fa-566f9008229d" (UID: "135f8205-c96c-4744-96fa-566f9008229d"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:52:36.620412 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.620383 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135f8205-c96c-4744-96fa-566f9008229d-kube-api-access-shzwk" (OuterVolumeSpecName: "kube-api-access-shzwk") pod "135f8205-c96c-4744-96fa-566f9008229d" (UID: "135f8205-c96c-4744-96fa-566f9008229d"). InnerVolumeSpecName "kube-api-access-shzwk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:52:36.620522 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.620419 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135f8205-c96c-4744-96fa-566f9008229d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "135f8205-c96c-4744-96fa-566f9008229d" (UID: "135f8205-c96c-4744-96fa-566f9008229d"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:52:36.719843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.719806 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shzwk\" (UniqueName: \"kubernetes.io/projected/135f8205-c96c-4744-96fa-566f9008229d-kube-api-access-shzwk\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:52:36.719843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.719839 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/135f8205-c96c-4744-96fa-566f9008229d-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:52:36.719843 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.719850 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/135f8205-c96c-4744-96fa-566f9008229d-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:52:36.720078 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.719861 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/135f8205-c96c-4744-96fa-566f9008229d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:52:36.995563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.995469 2566 generic.go:358] "Generic (PLEG): container finished" podID="135f8205-c96c-4744-96fa-566f9008229d" containerID="9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa" exitCode=0 Apr 17 17:52:36.995563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.995530 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerDied","Data":"9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa"} Apr 17 17:52:36.995563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.995560 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" event={"ID":"135f8205-c96c-4744-96fa-566f9008229d","Type":"ContainerDied","Data":"10774f5c7711366557f84d46684ba1d98e2c2b4fb1a74002f56c93eb77be64b6"} Apr 17 17:52:36.995800 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.995563 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t" Apr 17 17:52:36.995800 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:36.995575 2566 scope.go:117] "RemoveContainer" containerID="ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1" Apr 17 17:52:37.003263 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.003228 2566 scope.go:117] "RemoveContainer" containerID="9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa" Apr 17 17:52:37.010263 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.010231 2566 scope.go:117] "RemoveContainer" containerID="f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95" Apr 17 17:52:37.014180 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.014154 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t"] Apr 17 17:52:37.017524 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.017497 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-l747t"] Apr 17 17:52:37.018815 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.018797 2566 scope.go:117] "RemoveContainer" containerID="ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1" Apr 17 17:52:37.019075 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:52:37.019053 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1\": container with ID starting with ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1 not found: ID does not exist" containerID="ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1" Apr 17 17:52:37.019125 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.019084 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1"} err="failed to get container status \"ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1\": rpc error: code = NotFound desc = could not find container \"ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1\": container with ID starting with ea28fcdd1d3f427580619c55170e4feb73b36cfe5a8bed13c2908ef6aaaf02a1 not found: ID does not exist" Apr 17 17:52:37.019125 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.019103 2566 scope.go:117] "RemoveContainer" containerID="9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa" Apr 17 17:52:37.019323 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:52:37.019307 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa\": container with ID starting with 9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa not found: ID does not exist" containerID="9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa" Apr 17 17:52:37.019376 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.019329 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa"} err="failed to get container status \"9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa\": rpc error: code = NotFound desc = could not find container 
\"9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa\": container with ID starting with 9845d6514e2744b92eca571f0b2c4493230e321104ce2453d6bf10cba50c1eaa not found: ID does not exist" Apr 17 17:52:37.019376 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.019344 2566 scope.go:117] "RemoveContainer" containerID="f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95" Apr 17 17:52:37.019596 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:52:37.019580 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95\": container with ID starting with f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95 not found: ID does not exist" containerID="f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95" Apr 17 17:52:37.019641 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:37.019601 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95"} err="failed to get container status \"f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95\": rpc error: code = NotFound desc = could not find container \"f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95\": container with ID starting with f47ddef986964c2e67bb8e105e47aa60e4eb7f93e75baa907d3ec75acc23bf95 not found: ID does not exist" Apr 17 17:52:38.000637 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:38.000603 2566 generic.go:358] "Generic (PLEG): container finished" podID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerID="e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca" exitCode=0 Apr 17 17:52:38.001029 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:38.000666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerDied","Data":"e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca"} Apr 17 17:52:38.839125 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:38.839095 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135f8205-c96c-4744-96fa-566f9008229d" path="/var/lib/kubelet/pods/135f8205-c96c-4744-96fa-566f9008229d/volumes" Apr 17 17:52:39.005162 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:39.005123 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerStarted","Data":"e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193"} Apr 17 17:52:39.005162 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:39.005169 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerStarted","Data":"5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752"} Apr 17 17:52:39.005578 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:39.005397 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:39.026324 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:39.026279 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podStartSLOduration=7.026247783 podStartE2EDuration="7.026247783s" podCreationTimestamp="2026-04-17 17:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:52:39.02400233 +0000 UTC m=+1670.694081972" watchObservedRunningTime="2026-04-17 17:52:39.026247783 +0000 UTC m=+1670.696327425" Apr 17 17:52:40.008139 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:40.008106 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:40.009154 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:40.009129 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:52:41.010515 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:41.010471 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:52:46.015011 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:46.014982 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:52:46.015593 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:46.015566 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:52:56.016398 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:52:56.016356 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:53:06.015713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:06.015669 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:53:16.016481 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:16.016440 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:53:26.015579 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:26.015540 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" 
Apr 17 17:53:36.016058 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:36.016015 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:53:46.015630 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:46.015590 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:53:56.015533 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:56.015478 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:53:57.835036 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:53:57.834997 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:54:04.330382 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.330348 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz"] Apr 17 17:54:04.330788 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.330763 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" containerID="cri-o://5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752" gracePeriod=30 Apr 17 17:54:04.330873 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.330814 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kube-rbac-proxy" containerID="cri-o://e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193" gracePeriod=30 Apr 17 17:54:04.429280 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429223 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f"] Apr 17 17:54:04.429587 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429574 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" Apr 17 17:54:04.429635 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429589 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" Apr 17 17:54:04.429635 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429600 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kube-rbac-proxy" Apr 17 17:54:04.429635 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429605 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kube-rbac-proxy" Apr 17 17:54:04.429635 ip-10-0-140-147 kubenswrapper[2566]: 
I0417 17:54:04.429618 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="storage-initializer" Apr 17 17:54:04.429635 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429625 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="storage-initializer" Apr 17 17:54:04.429802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429680 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kube-rbac-proxy" Apr 17 17:54:04.429802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.429692 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="135f8205-c96c-4744-96fa-566f9008229d" containerName="kserve-container" Apr 17 17:54:04.432996 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.432975 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.435224 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.435199 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-529833-predictor-serving-cert\"" Apr 17 17:54:04.435224 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.435220 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-529833-kube-rbac-proxy-sar-config\"" Apr 17 17:54:04.443990 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.443967 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f"] Apr 17 17:54:04.572795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.572765 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kserve-provision-location\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.572795 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.572799 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvcb\" (UniqueName: \"kubernetes.io/projected/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kube-api-access-6lvcb\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.573040 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.572941 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a93bc4a-3c60-41e2-9501-0a94cde5b119-proxy-tls\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.573040 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.572994 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc4a-3c60-41e2-9501-0a94cde5b119-isvc-primary-529833-kube-rbac-proxy-sar-config\") pod 
\"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.673688 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.673592 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a93bc4a-3c60-41e2-9501-0a94cde5b119-proxy-tls\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.673688 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.673647 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc4a-3c60-41e2-9501-0a94cde5b119-isvc-primary-529833-kube-rbac-proxy-sar-config\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.673688 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.673680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kserve-provision-location\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.673969 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.673706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvcb\" (UniqueName: \"kubernetes.io/projected/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kube-api-access-6lvcb\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.674106 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.674083 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kserve-provision-location\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.674379 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.674359 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc4a-3c60-41e2-9501-0a94cde5b119-isvc-primary-529833-kube-rbac-proxy-sar-config\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.676020 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.675998 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a93bc4a-3c60-41e2-9501-0a94cde5b119-proxy-tls\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.683327 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:54:04.683307 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvcb\" (UniqueName: \"kubernetes.io/projected/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kube-api-access-6lvcb\") pod \"isvc-primary-529833-predictor-55864c5984-dcs8f\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.743411 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.743373 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:04.865450 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.865419 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f"] Apr 17 17:54:04.868409 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:54:04.868382 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a93bc4a_3c60_41e2_9501_0a94cde5b119.slice/crio-8e70176243169e437dadfbfe83cf4dffb31d5966f8e56a90d8126d0e70d05e98 WatchSource:0}: Error finding container 8e70176243169e437dadfbfe83cf4dffb31d5966f8e56a90d8126d0e70d05e98: Status 404 returned error can't find the container with id 8e70176243169e437dadfbfe83cf4dffb31d5966f8e56a90d8126d0e70d05e98 Apr 17 17:54:04.870126 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:04.870110 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:54:05.254689 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:05.254658 2566 generic.go:358] "Generic (PLEG): container finished" podID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerID="e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193" exitCode=2 Apr 17 17:54:05.254871 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:05.254742 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerDied","Data":"e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193"} Apr 17 17:54:05.256036 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:05.256008 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerStarted","Data":"4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6"} Apr 17 17:54:05.256036 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:05.256032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerStarted","Data":"8e70176243169e437dadfbfe83cf4dffb31d5966f8e56a90d8126d0e70d05e98"} Apr 17 17:54:06.011139 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:06.011096 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 17 17:54:07.835400 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.835364 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:54:07.971049 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.971021 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:54:07.996890 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.996861 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kserve-provision-location\") pod \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " Apr 17 17:54:07.997032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.996916 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls\") pod \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " Apr 17 17:54:07.997032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.996952 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " Apr 17 17:54:07.997032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.996971 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4rv\" (UniqueName: \"kubernetes.io/projected/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kube-api-access-gd4rv\") pod \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\" (UID: \"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8\") " Apr 17 17:54:07.997248 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.997219 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" (UID: "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:54:07.997333 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.997308 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" (UID: "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:54:07.999132 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.999105 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" (UID: "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:54:07.999233 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:07.999110 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kube-api-access-gd4rv" (OuterVolumeSpecName: "kube-api-access-gd4rv") pod "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" (UID: "3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8"). InnerVolumeSpecName "kube-api-access-gd4rv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:54:08.097669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.097568 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:54:08.097669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.097611 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:54:08.097669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.097624 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:54:08.097669 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.097634 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gd4rv\" (UniqueName: \"kubernetes.io/projected/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8-kube-api-access-gd4rv\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:54:08.266561 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.266520 2566 generic.go:358] "Generic (PLEG): container finished" podID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerID="5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752" exitCode=0 Apr 17 17:54:08.266722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.266596 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" Apr 17 17:54:08.266722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.266604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerDied","Data":"5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752"} Apr 17 17:54:08.266722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.266646 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz" event={"ID":"3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8","Type":"ContainerDied","Data":"a67b365193210cc10ecee3aea28924a3c56a259f5b1714ebc819c084a891ee17"} Apr 17 17:54:08.266722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.266663 2566 scope.go:117] "RemoveContainer" containerID="e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193" Apr 17 17:54:08.275169 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.275146 2566 scope.go:117] "RemoveContainer" containerID="5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752" Apr 17 17:54:08.282920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.282903 2566 scope.go:117] "RemoveContainer" containerID="e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca" Apr 17 17:54:08.288397 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.288373 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz"] Apr 17 17:54:08.291131 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.291113 2566 scope.go:117] "RemoveContainer" containerID="e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193" Apr 17 17:54:08.291466 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:54:08.291443 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193\": container with ID starting with e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193 not found: ID does not exist" containerID="e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193" Apr 17 17:54:08.291584 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.291471 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193"} err="failed to get container status \"e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193\": rpc error: code = NotFound desc = could not find container \"e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193\": container with ID starting with e65850cb47afeae78d8d044a91a7d0a633d57aa5f2f4905a33b508e97d784193 not found: ID does not exist" Apr 17 17:54:08.291584 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.291490 2566 scope.go:117] "RemoveContainer" containerID="5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752" Apr 17 17:54:08.291771 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:54:08.291752 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752\": container with ID starting with 5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752 not found: ID does not exist" 
containerID="5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752" Apr 17 17:54:08.291835 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.291778 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752"} err="failed to get container status \"5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752\": rpc error: code = NotFound desc = could not find container \"5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752\": container with ID starting with 5cffa2b9e2dc112cd0563d67ba658034b3e1d473304e167108b9df80f7e84752 not found: ID does not exist" Apr 17 17:54:08.291835 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.291796 2566 scope.go:117] "RemoveContainer" containerID="e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca" Apr 17 17:54:08.292056 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:54:08.292033 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca\": container with ID starting with e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca not found: ID does not exist" containerID="e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca" Apr 17 17:54:08.292118 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.292061 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca"} err="failed to get container status \"e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca\": rpc error: code = NotFound desc = could not find container \"e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca\": container with ID starting with e47dace4d0837f2c3c03678f679f23f8c1ee2065ea005a67971d4e27fb512bca not found: ID does not exist" Apr 17 17:54:08.292297 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.292275 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-jjzrz"] Apr 17 17:54:08.843787 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:08.843758 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" path="/var/lib/kubelet/pods/3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8/volumes" Apr 17 17:54:09.272216 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:09.272126 2566 generic.go:358] "Generic (PLEG): container finished" podID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerID="4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6" exitCode=0 Apr 17 17:54:09.272216 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:09.272184 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerDied","Data":"4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6"} Apr 17 17:54:10.277047 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:10.277010 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerStarted","Data":"38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a"} Apr 17 17:54:10.277501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:10.277053 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerStarted","Data":"c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c"} Apr 17 17:54:10.277501 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:10.277294 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:10.298893 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:10.298840 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podStartSLOduration=6.298824398 podStartE2EDuration="6.298824398s" podCreationTimestamp="2026-04-17 17:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:54:10.29870412 +0000 UTC m=+1761.968783763" watchObservedRunningTime="2026-04-17 17:54:10.298824398 +0000 UTC m=+1761.968904041" Apr 17 17:54:11.280236 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:11.280208 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:11.281455 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:11.281419 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:54:12.283861 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:12.283819 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:54:17.287858 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:17.287831 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:54:17.288521 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:17.288487 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:54:27.289073 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:27.288993 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:54:37.288428 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:37.288388 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:54:47.288684 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:47.288642 2566 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:54:48.877872 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:48.877842 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:54:48.883985 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:48.883962 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:54:57.289271 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:54:57.289225 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:55:07.289379 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:07.289337 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:55:17.289419 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:17.289384 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:55:24.614547 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614511 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs"] Apr 17 17:55:24.614920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614817 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="storage-initializer" Apr 17 17:55:24.614920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614832 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="storage-initializer" Apr 17 17:55:24.614920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614859 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kube-rbac-proxy" Apr 17 17:55:24.614920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614866 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kube-rbac-proxy" Apr 17 17:55:24.614920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614880 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" Apr 17 17:55:24.614920 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614886 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" Apr 17 17:55:24.615112 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614939 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" 
containerName="kube-rbac-proxy" Apr 17 17:55:24.615112 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.614948 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fe31fe9-e664-4bf1-aba1-3f26c2ceccd8" containerName="kserve-container" Apr 17 17:55:24.618140 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.618124 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.623685 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.623651 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-529833\"" Apr 17 17:55:24.623685 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.623675 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-529833-kube-rbac-proxy-sar-config\"" Apr 17 17:55:24.623890 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.623722 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-529833-dockercfg-kzwvk\"" Apr 17 17:55:24.623946 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.623926 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 17:55:24.624321 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.624301 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-529833-predictor-serving-cert\"" Apr 17 17:55:24.630120 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.630098 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs"] Apr 17 17:55:24.729841 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.729800 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6608e481-f1cd-4b46-ad07-771dd38bec23-kserve-provision-location\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.730048 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.729848 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccjz8\" (UniqueName: \"kubernetes.io/projected/6608e481-f1cd-4b46-ad07-771dd38bec23-kube-api-access-ccjz8\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.730048 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.729917 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6608e481-f1cd-4b46-ad07-771dd38bec23-proxy-tls\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.730048 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.729952 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-529833-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-isvc-secondary-529833-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.730048 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.730010 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-cabundle-cert\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.831233 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-cabundle-cert\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.831462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6608e481-f1cd-4b46-ad07-771dd38bec23-kserve-provision-location\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.831462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831318 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccjz8\" (UniqueName: \"kubernetes.io/projected/6608e481-f1cd-4b46-ad07-771dd38bec23-kube-api-access-ccjz8\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.831462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6608e481-f1cd-4b46-ad07-771dd38bec23-proxy-tls\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.831462 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831387 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-isvc-secondary-529833-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.831832 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831807 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6608e481-f1cd-4b46-ad07-771dd38bec23-kserve-provision-location\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " 
pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.832016 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.831997 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-cabundle-cert\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.832090 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.832034 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-isvc-secondary-529833-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.833966 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.833941 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6608e481-f1cd-4b46-ad07-771dd38bec23-proxy-tls\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.841663 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.841641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccjz8\" (UniqueName: \"kubernetes.io/projected/6608e481-f1cd-4b46-ad07-771dd38bec23-kube-api-access-ccjz8\") pod \"isvc-secondary-529833-predictor-685956fddc-9lbjs\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:24.928834 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:24.928728 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:25.055989 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:25.055966 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs"] Apr 17 17:55:25.057988 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:55:25.057958 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6608e481_f1cd_4b46_ad07_771dd38bec23.slice/crio-6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d WatchSource:0}: Error finding container 6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d: Status 404 returned error can't find the container with id 6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d Apr 17 17:55:25.500313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:25.500278 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" event={"ID":"6608e481-f1cd-4b46-ad07-771dd38bec23","Type":"ContainerStarted","Data":"e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40"} Apr 17 17:55:25.500313 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:25.500316 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" event={"ID":"6608e481-f1cd-4b46-ad07-771dd38bec23","Type":"ContainerStarted","Data":"6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d"} Apr 17 17:55:28.509786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:28.509713 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/0.log" Apr 17 17:55:28.509786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:28.509749 2566 generic.go:358] "Generic (PLEG): container finished" podID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerID="e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40" exitCode=1 Apr 17 17:55:28.510169 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:28.509831 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" event={"ID":"6608e481-f1cd-4b46-ad07-771dd38bec23","Type":"ContainerDied","Data":"e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40"} Apr 17 17:55:29.513933 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:29.513901 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/0.log" Apr 17 17:55:29.514353 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:29.514025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" event={"ID":"6608e481-f1cd-4b46-ad07-771dd38bec23","Type":"ContainerStarted","Data":"88c707899f13529c092fda0993a0c6826e8ded2b8adcaab89bdc00c05b8b985c"} Apr 17 17:55:34.531544 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.531457 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/1.log" Apr 17 17:55:34.531931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.531852 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/0.log" Apr 17 17:55:34.531931 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.531886 2566 generic.go:358] "Generic (PLEG): container finished" podID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerID="88c707899f13529c092fda0993a0c6826e8ded2b8adcaab89bdc00c05b8b985c" exitCode=1 Apr 17 17:55:34.532015 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.531966 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" event={"ID":"6608e481-f1cd-4b46-ad07-771dd38bec23","Type":"ContainerDied","Data":"88c707899f13529c092fda0993a0c6826e8ded2b8adcaab89bdc00c05b8b985c"} Apr 17 17:55:34.532015 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.532010 2566 scope.go:117] "RemoveContainer" containerID="e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40" Apr 17 17:55:34.532444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.532420 2566 scope.go:117] "RemoveContainer" containerID="e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40" Apr 17 17:55:34.542927 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:34.542898 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-529833-predictor-685956fddc-9lbjs_kserve-ci-e2e-test_6608e481-f1cd-4b46-ad07-771dd38bec23_0 in pod sandbox 6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d from index: no such id: 'e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40'" containerID="e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40" Apr 17 17:55:34.542999 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:34.542936 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-529833-predictor-685956fddc-9lbjs_kserve-ci-e2e-test_6608e481-f1cd-4b46-ad07-771dd38bec23_0 in pod sandbox 6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d from index: no such id: 'e70f9341d37cc78f5b308f88a10d5a977e17dd165457ee0cc7fb7ee1b95e9f40'" Apr 17 17:55:34.543099 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:34.543081 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-529833-predictor-685956fddc-9lbjs_kserve-ci-e2e-test(6608e481-f1cd-4b46-ad07-771dd38bec23)\"" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" Apr 17 17:55:35.536854 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:35.536821 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/1.log" Apr 17 17:55:40.644828 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.644797 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs"] Apr 17 17:55:40.711124 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.711095 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f"] Apr 17 17:55:40.711620 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.711561 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" containerID="cri-o://c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c" gracePeriod=30 Apr 17 17:55:40.711758 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.711587 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kube-rbac-proxy" containerID="cri-o://38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a" gracePeriod=30 Apr 17 17:55:40.795926 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.795900 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr"] Apr 17 17:55:40.801222 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.801188 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:40.804508 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.804485 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-59fc6e-predictor-serving-cert\"" Apr 17 17:55:40.804508 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.804491 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-59fc6e-dockercfg-rdrdd\"" Apr 17 17:55:40.804832 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.804814 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-59fc6e\"" Apr 17 17:55:40.805130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.805113 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\"" Apr 17 17:55:40.811707 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.811689 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr"] Apr 17 17:55:40.830835 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.830818 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/1.log" Apr 17 17:55:40.830929 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.830877 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:40.972540 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972459 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-cabundle-cert\") pod \"6608e481-f1cd-4b46-ad07-771dd38bec23\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " Apr 17 17:55:40.972540 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972502 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-isvc-secondary-529833-kube-rbac-proxy-sar-config\") pod \"6608e481-f1cd-4b46-ad07-771dd38bec23\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " Apr 17 17:55:40.972540 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972529 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6608e481-f1cd-4b46-ad07-771dd38bec23-proxy-tls\") pod \"6608e481-f1cd-4b46-ad07-771dd38bec23\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " Apr 17 17:55:40.972773 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972622 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6608e481-f1cd-4b46-ad07-771dd38bec23-kserve-provision-location\") pod \"6608e481-f1cd-4b46-ad07-771dd38bec23\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " Apr 17 17:55:40.972773 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972652 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccjz8\" (UniqueName: \"kubernetes.io/projected/6608e481-f1cd-4b46-ad07-771dd38bec23-kube-api-access-ccjz8\") pod \"6608e481-f1cd-4b46-ad07-771dd38bec23\" (UID: \"6608e481-f1cd-4b46-ad07-771dd38bec23\") " Apr 17 17:55:40.972851 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972813 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:40.972901 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972878 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:40.972952 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972906 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-isvc-secondary-529833-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-529833-kube-rbac-proxy-sar-config") pod "6608e481-f1cd-4b46-ad07-771dd38bec23" (UID: "6608e481-f1cd-4b46-ad07-771dd38bec23"). 
InnerVolumeSpecName "isvc-secondary-529833-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:55:40.973004 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972943 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6608e481-f1cd-4b46-ad07-771dd38bec23-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6608e481-f1cd-4b46-ad07-771dd38bec23" (UID: "6608e481-f1cd-4b46-ad07-771dd38bec23"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:55:40.973004 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.972961 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6608e481-f1cd-4b46-ad07-771dd38bec23" (UID: "6608e481-f1cd-4b46-ad07-771dd38bec23"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:55:40.973079 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.973040 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-cabundle-cert\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:40.973129 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.973075 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrmm\" (UniqueName: \"kubernetes.io/projected/48bf8985-2f70-4c31-98ef-3ed4b0011947-kube-api-access-mbrmm\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:40.973129 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.973101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bf8985-2f70-4c31-98ef-3ed4b0011947-kserve-provision-location\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:40.973211 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.973194 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6608e481-f1cd-4b46-ad07-771dd38bec23-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:40.973245 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.973219 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-cabundle-cert\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:40.973245 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.973235 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6608e481-f1cd-4b46-ad07-771dd38bec23-isvc-secondary-529833-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath 
\"\"" Apr 17 17:55:40.974767 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.974744 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6608e481-f1cd-4b46-ad07-771dd38bec23-kube-api-access-ccjz8" (OuterVolumeSpecName: "kube-api-access-ccjz8") pod "6608e481-f1cd-4b46-ad07-771dd38bec23" (UID: "6608e481-f1cd-4b46-ad07-771dd38bec23"). InnerVolumeSpecName "kube-api-access-ccjz8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:55:40.974833 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:40.974786 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6608e481-f1cd-4b46-ad07-771dd38bec23-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6608e481-f1cd-4b46-ad07-771dd38bec23" (UID: "6608e481-f1cd-4b46-ad07-771dd38bec23"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:55:41.073764 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.073726 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-cabundle-cert\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.073957 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.073773 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbrmm\" (UniqueName: \"kubernetes.io/projected/48bf8985-2f70-4c31-98ef-3ed4b0011947-kube-api-access-mbrmm\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.073957 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.073803 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bf8985-2f70-4c31-98ef-3ed4b0011947-kserve-provision-location\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.073957 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.073842 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.073957 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.073892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.074174 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.073982 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6608e481-f1cd-4b46-ad07-771dd38bec23-proxy-tls\") on node 
\"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:41.074174 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.074000 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ccjz8\" (UniqueName: \"kubernetes.io/projected/6608e481-f1cd-4b46-ad07-771dd38bec23-kube-api-access-ccjz8\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:41.074174 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:41.074051 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-serving-cert: secret "isvc-init-fail-59fc6e-predictor-serving-cert" not found Apr 17 17:55:41.074174 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:41.074121 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls podName:48bf8985-2f70-4c31-98ef-3ed4b0011947 nodeName:}" failed. No retries permitted until 2026-04-17 17:55:41.574105206 +0000 UTC m=+1853.244184830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls") pod "isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" (UID: "48bf8985-2f70-4c31-98ef-3ed4b0011947") : secret "isvc-init-fail-59fc6e-predictor-serving-cert" not found Apr 17 17:55:41.074377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.074230 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bf8985-2f70-4c31-98ef-3ed4b0011947-kserve-provision-location\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.074493 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.074475 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-cabundle-cert\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.074529 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.074492 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.081988 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.081968 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbrmm\" (UniqueName: \"kubernetes.io/projected/48bf8985-2f70-4c31-98ef-3ed4b0011947-kube-api-access-mbrmm\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.556268 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.556233 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-529833-predictor-685956fddc-9lbjs_6608e481-f1cd-4b46-ad07-771dd38bec23/storage-initializer/1.log" Apr 17 17:55:41.556443 ip-10-0-140-147 kubenswrapper[2566]: 
I0417 17:55:41.556371 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" Apr 17 17:55:41.556443 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.556374 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs" event={"ID":"6608e481-f1cd-4b46-ad07-771dd38bec23","Type":"ContainerDied","Data":"6d9b750611b7d030d8c57f2b918e9aca0861dfdda80743a761d75036a7f67c9d"} Apr 17 17:55:41.556443 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.556421 2566 scope.go:117] "RemoveContainer" containerID="88c707899f13529c092fda0993a0c6826e8ded2b8adcaab89bdc00c05b8b985c" Apr 17 17:55:41.558377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.558354 2566 generic.go:358] "Generic (PLEG): container finished" podID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerID="38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a" exitCode=2 Apr 17 17:55:41.558506 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.558385 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerDied","Data":"38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a"} Apr 17 17:55:41.579094 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.579072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.581336 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.581320 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls\") pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.591722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.591699 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs"] Apr 17 17:55:41.595971 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.595953 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-529833-predictor-685956fddc-9lbjs"] Apr 17 17:55:41.711841 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.711801 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:41.831691 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:41.831629 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr"] Apr 17 17:55:41.834240 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:55:41.834207 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48bf8985_2f70_4c31_98ef_3ed4b0011947.slice/crio-6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9 WatchSource:0}: Error finding container 6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9: Status 404 returned error can't find the container with id 6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9 Apr 17 17:55:42.284535 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:42.284442 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 17 17:55:42.563116 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:42.563082 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" event={"ID":"48bf8985-2f70-4c31-98ef-3ed4b0011947","Type":"ContainerStarted","Data":"c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c"} Apr 17 17:55:42.563350 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:42.563121 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" event={"ID":"48bf8985-2f70-4c31-98ef-3ed4b0011947","Type":"ContainerStarted","Data":"6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9"} Apr 17 17:55:42.838553 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:42.838469 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" path="/var/lib/kubelet/pods/6608e481-f1cd-4b46-ad07-771dd38bec23/volumes" Apr 17 17:55:45.049950 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.049927 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:55:45.213581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213480 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc4a-3c60-41e2-9501-0a94cde5b119-isvc-primary-529833-kube-rbac-proxy-sar-config\") pod \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " Apr 17 17:55:45.213581 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213558 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a93bc4a-3c60-41e2-9501-0a94cde5b119-proxy-tls\") pod \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " Apr 17 17:55:45.213819 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213659 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kserve-provision-location\") pod \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " Apr 17 17:55:45.213819 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213699 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lvcb\" (UniqueName: \"kubernetes.io/projected/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kube-api-access-6lvcb\") pod \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\" (UID: \"0a93bc4a-3c60-41e2-9501-0a94cde5b119\") " Apr 17 17:55:45.213932 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213864 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a93bc4a-3c60-41e2-9501-0a94cde5b119-isvc-primary-529833-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-529833-kube-rbac-proxy-sar-config") pod "0a93bc4a-3c60-41e2-9501-0a94cde5b119" (UID: "0a93bc4a-3c60-41e2-9501-0a94cde5b119"). InnerVolumeSpecName "isvc-primary-529833-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:55:45.213975 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213946 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-529833-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc4a-3c60-41e2-9501-0a94cde5b119-isvc-primary-529833-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:45.214007 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.213973 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a93bc4a-3c60-41e2-9501-0a94cde5b119" (UID: "0a93bc4a-3c60-41e2-9501-0a94cde5b119"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:55:45.215664 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.215637 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a93bc4a-3c60-41e2-9501-0a94cde5b119-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0a93bc4a-3c60-41e2-9501-0a94cde5b119" (UID: "0a93bc4a-3c60-41e2-9501-0a94cde5b119"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:55:45.215664 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.215653 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kube-api-access-6lvcb" (OuterVolumeSpecName: "kube-api-access-6lvcb") pod "0a93bc4a-3c60-41e2-9501-0a94cde5b119" (UID: "0a93bc4a-3c60-41e2-9501-0a94cde5b119"). InnerVolumeSpecName "kube-api-access-6lvcb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:55:45.314772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.314740 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a93bc4a-3c60-41e2-9501-0a94cde5b119-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:45.314772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.314767 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:45.314772 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.314777 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6lvcb\" (UniqueName: \"kubernetes.io/projected/0a93bc4a-3c60-41e2-9501-0a94cde5b119-kube-api-access-6lvcb\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:45.575577 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.575546 2566 generic.go:358] "Generic (PLEG): container finished" podID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerID="c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c" exitCode=0 Apr 17 17:55:45.575739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.575622 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" Apr 17 17:55:45.575739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.575632 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerDied","Data":"c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c"} Apr 17 17:55:45.575739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.575672 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f" event={"ID":"0a93bc4a-3c60-41e2-9501-0a94cde5b119","Type":"ContainerDied","Data":"8e70176243169e437dadfbfe83cf4dffb31d5966f8e56a90d8126d0e70d05e98"} Apr 17 17:55:45.575739 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.575689 2566 scope.go:117] "RemoveContainer" containerID="38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a" Apr 17 17:55:45.585832 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.585814 2566 scope.go:117] "RemoveContainer" containerID="c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c" Apr 17 17:55:45.592612 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.592595 2566 scope.go:117] "RemoveContainer" containerID="4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6" Apr 17 17:55:45.597650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.597628 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f"] Apr 17 17:55:45.599539 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.599524 2566 scope.go:117] "RemoveContainer" containerID="38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a" Apr 17 17:55:45.599772 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:45.599754 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a\": container with ID starting with 38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a not found: ID does not exist" containerID="38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a" Apr 17 17:55:45.599819 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.599779 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a"} err="failed to get container status \"38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a\": rpc error: code = NotFound desc = could not find container \"38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a\": container with ID starting with 38d750e7aa141393747f34e8365b1fb50853e9845cb3dee61e96a0651c70b42a not found: ID does not exist" Apr 17 17:55:45.599819 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.599796 2566 scope.go:117] "RemoveContainer" containerID="c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c" Apr 17 17:55:45.600024 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:45.600005 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c\": container with ID starting with c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c not found: ID does not exist" 
containerID="c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c" Apr 17 17:55:45.600065 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.600030 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c"} err="failed to get container status \"c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c\": rpc error: code = NotFound desc = could not find container \"c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c\": container with ID starting with c8edb0a3fc9c6f12745f2c032e8e7a8b39bdfdec35ec4f2eaea887ba6149668c not found: ID does not exist" Apr 17 17:55:45.600065 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.600046 2566 scope.go:117] "RemoveContainer" containerID="4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6" Apr 17 17:55:45.600285 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:45.600249 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6\": container with ID starting with 4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6 not found: ID does not exist" containerID="4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6" Apr 17 17:55:45.600377 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.600286 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6"} err="failed to get container status \"4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6\": rpc error: code = NotFound desc = could not find container \"4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6\": container with ID starting with 4dc66894c9f5148c41f32c73d88eb0e02cd3a9c498aa0ec7044f87765c371aa6 not found: ID does not exist" Apr 17 17:55:45.603032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:45.603006 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-529833-predictor-55864c5984-dcs8f"] Apr 17 17:55:46.837698 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:46.837668 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" path="/var/lib/kubelet/pods/0a93bc4a-3c60-41e2-9501-0a94cde5b119/volumes" Apr 17 17:55:47.583370 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:47.583344 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/0.log" Apr 17 17:55:47.583554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:47.583380 2566 generic.go:358] "Generic (PLEG): container finished" podID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerID="c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c" exitCode=1 Apr 17 17:55:47.583554 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:47.583429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" event={"ID":"48bf8985-2f70-4c31-98ef-3ed4b0011947","Type":"ContainerDied","Data":"c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c"} Apr 17 17:55:48.587575 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:48.587545 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/0.log" Apr 17 17:55:48.588017 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:48.587636 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" event={"ID":"48bf8985-2f70-4c31-98ef-3ed4b0011947","Type":"ContainerStarted","Data":"1532119b9f8485bf49979c5b0caa381af8ba862f74862cd686afe5a3c07fc586"} Apr 17 17:55:50.594485 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.594453 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/1.log" Apr 17 17:55:50.594885 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.594834 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/0.log" Apr 17 17:55:50.594932 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.594876 2566 generic.go:358] "Generic (PLEG): container finished" podID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerID="1532119b9f8485bf49979c5b0caa381af8ba862f74862cd686afe5a3c07fc586" exitCode=1 Apr 17 17:55:50.594984 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.594948 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" event={"ID":"48bf8985-2f70-4c31-98ef-3ed4b0011947","Type":"ContainerDied","Data":"1532119b9f8485bf49979c5b0caa381af8ba862f74862cd686afe5a3c07fc586"} Apr 17 17:55:50.595027 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.594996 2566 scope.go:117] "RemoveContainer" containerID="c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c" Apr 17 17:55:50.595429 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.595407 2566 scope.go:117] "RemoveContainer" containerID="c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c" Apr 17 17:55:50.605348 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:50.605318 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_kserve-ci-e2e-test_48bf8985-2f70-4c31-98ef-3ed4b0011947_0 in pod sandbox 6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9 from index: no such id: 'c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c'" containerID="c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c" Apr 17 17:55:50.605424 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:50.605360 2566 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_kserve-ci-e2e-test_48bf8985-2f70-4c31-98ef-3ed4b0011947_0 in pod sandbox 6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9 from index: no such id: 'c8071997389aaa3436f5ad3c588a368760480b24ccf5d504fe297b35ae0d111c'; Skipping pod \"isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_kserve-ci-e2e-test(48bf8985-2f70-4c31-98ef-3ed4b0011947)\"" logger="UnhandledError" Apr 17 17:55:50.606715 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:55:50.606689 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_kserve-ci-e2e-test(48bf8985-2f70-4c31-98ef-3ed4b0011947)\"" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" Apr 17 17:55:50.765623 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:50.765589 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr"] Apr 17 17:55:51.233899 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.233868 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762"] Apr 17 17:55:51.234188 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234177 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerName="storage-initializer" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234190 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerName="storage-initializer" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234200 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kube-rbac-proxy" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234205 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kube-rbac-proxy" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234216 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234222 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234229 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="storage-initializer" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234235 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="storage-initializer" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234241 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerName="storage-initializer" Apr 17 17:55:51.234244 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234246 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerName="storage-initializer" Apr 17 17:55:51.234563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234311 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerName="storage-initializer" Apr 17 17:55:51.234563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234320 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6608e481-f1cd-4b46-ad07-771dd38bec23" containerName="storage-initializer" Apr 17 17:55:51.234563 ip-10-0-140-147 
kubenswrapper[2566]: I0417 17:55:51.234326 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kserve-container" Apr 17 17:55:51.234563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.234334 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a93bc4a-3c60-41e2-9501-0a94cde5b119" containerName="kube-rbac-proxy" Apr 17 17:55:51.238721 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.238702 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.241093 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.241073 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b25c4\"" Apr 17 17:55:51.241478 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.241457 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 17 17:55:51.241728 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.241715 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 17 17:55:51.247895 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.247875 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762"] Apr 17 17:55:51.263838 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.263816 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3150d04-4259-447a-808d-bf9278ad4eaa-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.263926 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.263850 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3150d04-4259-447a-808d-bf9278ad4eaa-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.263971 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.263924 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3150d04-4259-447a-808d-bf9278ad4eaa-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.263971 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.263950 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdv4n\" (UniqueName: \"kubernetes.io/projected/f3150d04-4259-447a-808d-bf9278ad4eaa-kube-api-access-vdv4n\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 
17:55:51.365155 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.365114 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3150d04-4259-447a-808d-bf9278ad4eaa-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.365364 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.365169 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3150d04-4259-447a-808d-bf9278ad4eaa-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.365364 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.365245 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3150d04-4259-447a-808d-bf9278ad4eaa-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.365364 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.365302 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdv4n\" (UniqueName: \"kubernetes.io/projected/f3150d04-4259-447a-808d-bf9278ad4eaa-kube-api-access-vdv4n\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.365575 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.365555 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3150d04-4259-447a-808d-bf9278ad4eaa-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.365921 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.365899 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3150d04-4259-447a-808d-bf9278ad4eaa-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.367636 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.367618 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3150d04-4259-447a-808d-bf9278ad4eaa-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.374241 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.374218 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdv4n\" (UniqueName: 
\"kubernetes.io/projected/f3150d04-4259-447a-808d-bf9278ad4eaa-kube-api-access-vdv4n\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-q2762\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.549624 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.549525 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:55:51.600004 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.599939 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/1.log" Apr 17 17:55:51.674175 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.674145 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762"] Apr 17 17:55:51.677149 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:55:51.677119 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3150d04_4259_447a_808d_bf9278ad4eaa.slice/crio-7ded926819cf9be885480567abb17dfa09646cbb48b81e1c0fa77c05330abdf2 WatchSource:0}: Error finding container 7ded926819cf9be885480567abb17dfa09646cbb48b81e1c0fa77c05330abdf2: Status 404 returned error can't find the container with id 7ded926819cf9be885480567abb17dfa09646cbb48b81e1c0fa77c05330abdf2 Apr 17 17:55:51.720489 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.720468 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/1.log" Apr 17 17:55:51.720613 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.720546 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:51.767726 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.767704 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\") pod \"48bf8985-2f70-4c31-98ef-3ed4b0011947\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " Apr 17 17:55:51.767863 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.767762 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls\") pod \"48bf8985-2f70-4c31-98ef-3ed4b0011947\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " Apr 17 17:55:51.767863 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.767791 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbrmm\" (UniqueName: \"kubernetes.io/projected/48bf8985-2f70-4c31-98ef-3ed4b0011947-kube-api-access-mbrmm\") pod \"48bf8985-2f70-4c31-98ef-3ed4b0011947\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " Apr 17 17:55:51.767863 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.767813 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-cabundle-cert\") pod \"48bf8985-2f70-4c31-98ef-3ed4b0011947\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " Apr 17 17:55:51.768221 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.768184 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config") pod "48bf8985-2f70-4c31-98ef-3ed4b0011947" (UID: "48bf8985-2f70-4c31-98ef-3ed4b0011947"). InnerVolumeSpecName "isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:55:51.768349 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.768196 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "48bf8985-2f70-4c31-98ef-3ed4b0011947" (UID: "48bf8985-2f70-4c31-98ef-3ed4b0011947"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:55:51.769797 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.769777 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bf8985-2f70-4c31-98ef-3ed4b0011947-kube-api-access-mbrmm" (OuterVolumeSpecName: "kube-api-access-mbrmm") pod "48bf8985-2f70-4c31-98ef-3ed4b0011947" (UID: "48bf8985-2f70-4c31-98ef-3ed4b0011947"). InnerVolumeSpecName "kube-api-access-mbrmm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:55:51.769846 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.769831 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "48bf8985-2f70-4c31-98ef-3ed4b0011947" (UID: "48bf8985-2f70-4c31-98ef-3ed4b0011947"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:55:51.868485 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.868452 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bf8985-2f70-4c31-98ef-3ed4b0011947-kserve-provision-location\") pod \"48bf8985-2f70-4c31-98ef-3ed4b0011947\" (UID: \"48bf8985-2f70-4c31-98ef-3ed4b0011947\") " Apr 17 17:55:51.868673 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.868622 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-cabundle-cert\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:51.868673 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.868635 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48bf8985-2f70-4c31-98ef-3ed4b0011947-isvc-init-fail-59fc6e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:51.868673 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.868644 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48bf8985-2f70-4c31-98ef-3ed4b0011947-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:51.868673 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.868653 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbrmm\" (UniqueName: \"kubernetes.io/projected/48bf8985-2f70-4c31-98ef-3ed4b0011947-kube-api-access-mbrmm\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:51.868810 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.868687 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48bf8985-2f70-4c31-98ef-3ed4b0011947-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "48bf8985-2f70-4c31-98ef-3ed4b0011947" (UID: "48bf8985-2f70-4c31-98ef-3ed4b0011947"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:55:51.969146 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:51.969109 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bf8985-2f70-4c31-98ef-3ed4b0011947-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:55:52.604047 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.604006 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerStarted","Data":"314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856"} Apr 17 17:55:52.604499 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.604052 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerStarted","Data":"7ded926819cf9be885480567abb17dfa09646cbb48b81e1c0fa77c05330abdf2"} Apr 17 17:55:52.605168 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.605149 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr_48bf8985-2f70-4c31-98ef-3ed4b0011947/storage-initializer/1.log" Apr 17 17:55:52.605247 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.605223 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" event={"ID":"48bf8985-2f70-4c31-98ef-3ed4b0011947","Type":"ContainerDied","Data":"6cbd4583e7ce44e05adbbda257ee84d7ea4a6eaa12d1f855b2e97e117bfcded9"} Apr 17 17:55:52.605334 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.605248 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr" Apr 17 17:55:52.605334 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.605248 2566 scope.go:117] "RemoveContainer" containerID="1532119b9f8485bf49979c5b0caa381af8ba862f74862cd686afe5a3c07fc586" Apr 17 17:55:52.649302 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.649228 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr"] Apr 17 17:55:52.653723 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.653694 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-59fc6e-predictor-5c54fb46c-dhhhr"] Apr 17 17:55:52.837991 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:52.837953 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" path="/var/lib/kubelet/pods/48bf8985-2f70-4c31-98ef-3ed4b0011947/volumes" Apr 17 17:55:56.618465 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:56.618385 2566 generic.go:358] "Generic (PLEG): container finished" podID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerID="314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856" exitCode=0 Apr 17 17:55:56.618465 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:55:56.618433 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerDied","Data":"314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856"} Apr 17 17:56:17.696032 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:17.695993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerStarted","Data":"ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f"} Apr 17 17:56:17.696478 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:17.696041 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerStarted","Data":"0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8"} Apr 17 17:56:17.696478 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:17.696217 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:56:17.729044 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:17.728997 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podStartSLOduration=6.384344591 podStartE2EDuration="26.728983739s" podCreationTimestamp="2026-04-17 17:55:51 +0000 UTC" firstStartedPulling="2026-04-17 17:55:56.619540465 +0000 UTC m=+1868.289620084" lastFinishedPulling="2026-04-17 17:56:16.9641796 +0000 UTC m=+1888.634259232" observedRunningTime="2026-04-17 17:56:17.724992364 +0000 UTC m=+1889.395072004" watchObservedRunningTime="2026-04-17 17:56:17.728983739 +0000 UTC m=+1889.399063380" Apr 17 17:56:18.699051 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:18.699025 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:56:18.700168 ip-10-0-140-147 kubenswrapper[2566]: 
I0417 17:56:18.700140 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:56:19.701529 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:19.701487 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:56:24.705913 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:24.705884 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:56:24.706391 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:24.706367 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:56:34.707306 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:34.707246 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:56:44.706442 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:44.706400 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:56:54.706686 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:56:54.706639 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:57:04.706593 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:04.706547 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:57:14.707008 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:14.706960 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:57:24.707135 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:24.707095 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: 
connect: connection refused" Apr 17 17:57:30.837549 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:30.837522 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:57:41.166816 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.166738 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762"] Apr 17 17:57:41.167320 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.167072 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" containerID="cri-o://0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8" gracePeriod=30 Apr 17 17:57:41.167320 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.167104 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kube-rbac-proxy" containerID="cri-o://ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f" gracePeriod=30 Apr 17 17:57:41.285620 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.285591 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9"] Apr 17 17:57:41.285902 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.285890 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerName="storage-initializer" Apr 17 17:57:41.285960 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.285904 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerName="storage-initializer" Apr 17 17:57:41.286008 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.285960 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerName="storage-initializer" Apr 17 17:57:41.286008 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.285968 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerName="storage-initializer" Apr 17 17:57:41.286080 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.286021 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerName="storage-initializer" Apr 17 17:57:41.286080 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.286028 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bf8985-2f70-4c31-98ef-3ed4b0011947" containerName="storage-initializer" Apr 17 17:57:41.288724 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.288708 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.291229 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.291211 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 17 17:57:41.291327 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.291305 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 17 17:57:41.302453 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.302431 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9"] Apr 17 17:57:41.395622 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.395585 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83350f4e-1bbd-4246-b831-14e1a75e8a94-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.395796 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.395634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.395796 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.395751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83350f4e-1bbd-4246-b831-14e1a75e8a94-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.395796 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.395788 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjwn\" (UniqueName: \"kubernetes.io/projected/83350f4e-1bbd-4246-b831-14e1a75e8a94-kube-api-access-6rjwn\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.496422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.496344 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83350f4e-1bbd-4246-b831-14e1a75e8a94-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.496422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.496376 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjwn\" (UniqueName: 
\"kubernetes.io/projected/83350f4e-1bbd-4246-b831-14e1a75e8a94-kube-api-access-6rjwn\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.496422 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.496415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83350f4e-1bbd-4246-b831-14e1a75e8a94-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.496664 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.496457 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.496664 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:41.496570 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-serving-cert: secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 17 17:57:41.496664 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:41.496642 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls podName:83350f4e-1bbd-4246-b831-14e1a75e8a94 nodeName:}" failed. No retries permitted until 2026-04-17 17:57:41.996619706 +0000 UTC m=+1973.666699329 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls") pod "isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" (UID: "83350f4e-1bbd-4246-b831-14e1a75e8a94") : secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 17 17:57:41.496838 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.496820 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83350f4e-1bbd-4246-b831-14e1a75e8a94-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.497050 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.497033 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83350f4e-1bbd-4246-b831-14e1a75e8a94-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.506917 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.506884 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjwn\" (UniqueName: \"kubernetes.io/projected/83350f4e-1bbd-4246-b831-14e1a75e8a94-kube-api-access-6rjwn\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:41.939531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.939484 2566 generic.go:358] "Generic (PLEG): container finished" podID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerID="ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f" exitCode=2 Apr 17 17:57:41.939706 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:41.939557 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerDied","Data":"ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f"} Apr 17 17:57:42.000186 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:42.000141 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:42.000380 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:42.000305 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-serving-cert: secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 17 17:57:42.000380 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:42.000363 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls podName:83350f4e-1bbd-4246-b831-14e1a75e8a94 nodeName:}" failed. No retries permitted until 2026-04-17 17:57:43.000348425 +0000 UTC m=+1974.670428057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls") pod "isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" (UID: "83350f4e-1bbd-4246-b831-14e1a75e8a94") : secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 17 17:57:43.009658 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:43.009618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:43.012082 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:43.012063 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:43.098894 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:43.098855 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:43.227147 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:43.227126 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9"] Apr 17 17:57:43.229598 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:57:43.229569 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83350f4e_1bbd_4246_b831_14e1a75e8a94.slice/crio-f590b3b20ea9f32d00dfe22a6bbe4178a51a7defba06eb0ecc057b4aa6e663c3 WatchSource:0}: Error finding container f590b3b20ea9f32d00dfe22a6bbe4178a51a7defba06eb0ecc057b4aa6e663c3: Status 404 returned error can't find the container with id f590b3b20ea9f32d00dfe22a6bbe4178a51a7defba06eb0ecc057b4aa6e663c3 Apr 17 17:57:43.946322 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:43.946283 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerStarted","Data":"fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279"} Apr 17 17:57:43.946322 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:43.946327 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerStarted","Data":"f590b3b20ea9f32d00dfe22a6bbe4178a51a7defba06eb0ecc057b4aa6e663c3"} Apr 17 17:57:44.702023 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:44.701976 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 17 17:57:46.106942 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.106917 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:57:46.132243 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132218 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3150d04-4259-447a-808d-bf9278ad4eaa-kserve-provision-location\") pod \"f3150d04-4259-447a-808d-bf9278ad4eaa\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " Apr 17 17:57:46.132411 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132340 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3150d04-4259-447a-808d-bf9278ad4eaa-proxy-tls\") pod \"f3150d04-4259-447a-808d-bf9278ad4eaa\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " Apr 17 17:57:46.132411 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132391 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3150d04-4259-447a-808d-bf9278ad4eaa-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"f3150d04-4259-447a-808d-bf9278ad4eaa\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " Apr 17 17:57:46.132545 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132432 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdv4n\" (UniqueName: \"kubernetes.io/projected/f3150d04-4259-447a-808d-bf9278ad4eaa-kube-api-access-vdv4n\") pod \"f3150d04-4259-447a-808d-bf9278ad4eaa\" (UID: \"f3150d04-4259-447a-808d-bf9278ad4eaa\") " Apr 17 17:57:46.132605 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132556 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3150d04-4259-447a-808d-bf9278ad4eaa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f3150d04-4259-447a-808d-bf9278ad4eaa" (UID: "f3150d04-4259-447a-808d-bf9278ad4eaa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:46.132665 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132644 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3150d04-4259-447a-808d-bf9278ad4eaa-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:57:46.132861 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.132834 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3150d04-4259-447a-808d-bf9278ad4eaa-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "f3150d04-4259-447a-808d-bf9278ad4eaa" (UID: "f3150d04-4259-447a-808d-bf9278ad4eaa"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:57:46.134500 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.134471 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3150d04-4259-447a-808d-bf9278ad4eaa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f3150d04-4259-447a-808d-bf9278ad4eaa" (UID: "f3150d04-4259-447a-808d-bf9278ad4eaa"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:57:46.134717 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.134695 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3150d04-4259-447a-808d-bf9278ad4eaa-kube-api-access-vdv4n" (OuterVolumeSpecName: "kube-api-access-vdv4n") pod "f3150d04-4259-447a-808d-bf9278ad4eaa" (UID: "f3150d04-4259-447a-808d-bf9278ad4eaa"). InnerVolumeSpecName "kube-api-access-vdv4n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:57:46.233291 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.233182 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3150d04-4259-447a-808d-bf9278ad4eaa-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:57:46.233291 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.233216 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3150d04-4259-447a-808d-bf9278ad4eaa-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:57:46.233291 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.233229 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdv4n\" (UniqueName: \"kubernetes.io/projected/f3150d04-4259-447a-808d-bf9278ad4eaa-kube-api-access-vdv4n\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:57:46.956563 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.956531 2566 generic.go:358] "Generic (PLEG): container finished" podID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerID="fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279" exitCode=0 Apr 17 17:57:46.956720 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.956606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerDied","Data":"fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279"} Apr 17 17:57:46.958353 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.958334 2566 generic.go:358] "Generic (PLEG): container finished" podID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerID="0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8" exitCode=0 Apr 17 17:57:46.958444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.958387 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerDied","Data":"0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8"} Apr 17 17:57:46.958444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.958404 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" Apr 17 17:57:46.958444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.958412 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762" event={"ID":"f3150d04-4259-447a-808d-bf9278ad4eaa","Type":"ContainerDied","Data":"7ded926819cf9be885480567abb17dfa09646cbb48b81e1c0fa77c05330abdf2"} Apr 17 17:57:46.958444 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.958433 2566 scope.go:117] "RemoveContainer" containerID="ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f" Apr 17 17:57:46.968830 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.968804 2566 scope.go:117] "RemoveContainer" containerID="0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8" Apr 17 17:57:46.975895 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.975875 2566 scope.go:117] "RemoveContainer" containerID="314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856" Apr 17 17:57:46.986781 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.986766 2566 scope.go:117] "RemoveContainer" containerID="ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f" Apr 17 17:57:46.987055 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:46.987035 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f\": container with ID starting with ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f not found: ID does not exist" containerID="ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f" Apr 17 17:57:46.987138 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.987062 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f"} err="failed to get container status \"ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f\": rpc error: code = NotFound desc = could not find container \"ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f\": container with ID starting with ec93032d5b038a4e9bd0e18f29c6fbd76fc9817fcf849d858c2927d3a4680f1f not found: ID does not exist" Apr 17 17:57:46.987138 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.987080 2566 scope.go:117] "RemoveContainer" containerID="0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8" Apr 17 17:57:46.987376 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:46.987358 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8\": container with ID starting with 0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8 not found: ID does not exist" containerID="0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8" Apr 17 17:57:46.987436 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.987382 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8"} err="failed to get container status \"0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8\": rpc error: code = NotFound desc = could not find container \"0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8\": container with ID starting with 
0f49e55973a7f577768dbe62b5497627e93b171337c688fe6fb85205ba554ea8 not found: ID does not exist" Apr 17 17:57:46.987436 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.987398 2566 scope.go:117] "RemoveContainer" containerID="314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856" Apr 17 17:57:46.987639 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:57:46.987621 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856\": container with ID starting with 314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856 not found: ID does not exist" containerID="314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856" Apr 17 17:57:46.987679 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.987647 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856"} err="failed to get container status \"314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856\": rpc error: code = NotFound desc = could not find container \"314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856\": container with ID starting with 314d0f20e2ee47817bb640371c30b2da9131182730b5da411f629f1a1dc9a856 not found: ID does not exist" Apr 17 17:57:46.994520 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.994500 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762"] Apr 17 17:57:46.998616 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:46.998595 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-q2762"] Apr 17 17:57:47.965110 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:47.965075 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerStarted","Data":"628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e"} Apr 17 17:57:47.965531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:47.965117 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerStarted","Data":"1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43"} Apr 17 17:57:47.965531 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:47.965333 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:47.988853 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:47.988799 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podStartSLOduration=6.988782346 podStartE2EDuration="6.988782346s" podCreationTimestamp="2026-04-17 17:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:57:47.98671391 +0000 UTC m=+1979.656793552" watchObservedRunningTime="2026-04-17 17:57:47.988782346 +0000 UTC m=+1979.658861990" Apr 17 17:57:48.839022 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:48.838993 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" path="/var/lib/kubelet/pods/f3150d04-4259-447a-808d-bf9278ad4eaa/volumes" Apr 17 17:57:48.967910 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:48.967879 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:48.969131 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:48.969105 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:57:49.970130 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:49.970092 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:57:54.974526 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:54.974493 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:57:54.975074 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:57:54.975045 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:58:04.975868 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:58:04.975830 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:58:14.975144 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:58:14.975098 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:58:24.975271 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:58:24.975218 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:58:34.975852 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:58:34.975814 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:58:44.975059 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:58:44.975023 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:58:54.975652 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:58:54.975616 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:59:00.838071 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:00.837998 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:59:01.350695 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.350656 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9"] Apr 17 17:59:01.456404 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456363 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4"] Apr 17 17:59:01.456727 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456711 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kube-rbac-proxy" Apr 17 17:59:01.456786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456730 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kube-rbac-proxy" Apr 17 17:59:01.456786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456740 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="storage-initializer" Apr 17 17:59:01.456786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456745 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="storage-initializer" Apr 17 17:59:01.456786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456753 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" Apr 17 17:59:01.456786 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456762 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" Apr 17 17:59:01.456946 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456818 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kube-rbac-proxy" Apr 17 17:59:01.456946 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.456831 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3150d04-4259-447a-808d-bf9278ad4eaa" containerName="kserve-container" Apr 17 17:59:01.465089 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.465065 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.470757 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.470734 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4"] Apr 17 17:59:01.471104 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.471083 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 17 17:59:01.471274 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.471143 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 17 17:59:01.542629 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.542600 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/343db112-9603-4a01-95dd-da645bc1a35a-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.542802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.542641 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.542802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.542711 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/343db112-9603-4a01-95dd-da645bc1a35a-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.542802 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.542734 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqhmv\" (UniqueName: \"kubernetes.io/projected/343db112-9603-4a01-95dd-da645bc1a35a-kube-api-access-mqhmv\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.643683 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.643591 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/343db112-9603-4a01-95dd-da645bc1a35a-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.643683 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.643638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.643916 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:59:01.643747 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-serving-cert: secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 17 17:59:01.643916 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:59:01.643805 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls podName:343db112-9603-4a01-95dd-da645bc1a35a nodeName:}" failed. No retries permitted until 2026-04-17 17:59:02.14378794 +0000 UTC m=+2053.813867561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls") pod "isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" (UID: "343db112-9603-4a01-95dd-da645bc1a35a") : secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 17 17:59:01.643916 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.643818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/343db112-9603-4a01-95dd-da645bc1a35a-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.643916 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.643837 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqhmv\" (UniqueName: \"kubernetes.io/projected/343db112-9603-4a01-95dd-da645bc1a35a-kube-api-access-mqhmv\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.644139 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.644043 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/343db112-9603-4a01-95dd-da645bc1a35a-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.644543 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.644526 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/343db112-9603-4a01-95dd-da645bc1a35a-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:01.654938 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:01.654906 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqhmv\" (UniqueName: \"kubernetes.io/projected/343db112-9603-4a01-95dd-da645bc1a35a-kube-api-access-mqhmv\") pod 
\"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:02.148777 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:02.148734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:02.151176 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:02.151156 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:02.182734 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:02.182699 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kube-rbac-proxy" containerID="cri-o://628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e" gracePeriod=30 Apr 17 17:59:02.182905 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:02.182678 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" containerID="cri-o://1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43" gracePeriod=30 Apr 17 17:59:02.375737 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:02.375704 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:02.497922 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:02.497897 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4"] Apr 17 17:59:02.500241 ip-10-0-140-147 kubenswrapper[2566]: W0417 17:59:02.500211 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343db112_9603_4a01_95dd_da645bc1a35a.slice/crio-e12a5d5286611ce93039cba2ad39af58e3e69ffe6832c9e95990a0cbf72eacac WatchSource:0}: Error finding container e12a5d5286611ce93039cba2ad39af58e3e69ffe6832c9e95990a0cbf72eacac: Status 404 returned error can't find the container with id e12a5d5286611ce93039cba2ad39af58e3e69ffe6832c9e95990a0cbf72eacac Apr 17 17:59:03.186275 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:03.186232 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerStarted","Data":"cb8ca2d62fa26abbab2b32f6b8e42b2414b75b61112abac7b7830a5ad6b6084b"} Apr 17 17:59:03.186722 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:03.186284 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerStarted","Data":"e12a5d5286611ce93039cba2ad39af58e3e69ffe6832c9e95990a0cbf72eacac"} Apr 17 17:59:03.187988 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:03.187963 2566 generic.go:358] "Generic (PLEG): container finished" podID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerID="628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e" exitCode=2 Apr 17 17:59:03.188095 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:03.188031 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerDied","Data":"628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e"} Apr 17 17:59:04.970851 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:04.970811 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 17 17:59:07.124650 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.124627 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:59:07.187541 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.187454 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83350f4e-1bbd-4246-b831-14e1a75e8a94-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"83350f4e-1bbd-4246-b831-14e1a75e8a94\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " Apr 17 17:59:07.187713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.187554 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rjwn\" (UniqueName: \"kubernetes.io/projected/83350f4e-1bbd-4246-b831-14e1a75e8a94-kube-api-access-6rjwn\") pod \"83350f4e-1bbd-4246-b831-14e1a75e8a94\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " Apr 17 17:59:07.187713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.187632 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") pod \"83350f4e-1bbd-4246-b831-14e1a75e8a94\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " Apr 17 17:59:07.187713 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.187666 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83350f4e-1bbd-4246-b831-14e1a75e8a94-kserve-provision-location\") pod \"83350f4e-1bbd-4246-b831-14e1a75e8a94\" (UID: \"83350f4e-1bbd-4246-b831-14e1a75e8a94\") " Apr 17 17:59:07.187871 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.187833 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83350f4e-1bbd-4246-b831-14e1a75e8a94-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "83350f4e-1bbd-4246-b831-14e1a75e8a94" (UID: "83350f4e-1bbd-4246-b831-14e1a75e8a94"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:59:07.187962 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.187945 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83350f4e-1bbd-4246-b831-14e1a75e8a94-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:59:07.188036 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.188005 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83350f4e-1bbd-4246-b831-14e1a75e8a94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83350f4e-1bbd-4246-b831-14e1a75e8a94" (UID: "83350f4e-1bbd-4246-b831-14e1a75e8a94"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:59:07.189505 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.189489 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83350f4e-1bbd-4246-b831-14e1a75e8a94-kube-api-access-6rjwn" (OuterVolumeSpecName: "kube-api-access-6rjwn") pod "83350f4e-1bbd-4246-b831-14e1a75e8a94" (UID: "83350f4e-1bbd-4246-b831-14e1a75e8a94"). 
InnerVolumeSpecName "kube-api-access-6rjwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:59:07.189699 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.189678 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "83350f4e-1bbd-4246-b831-14e1a75e8a94" (UID: "83350f4e-1bbd-4246-b831-14e1a75e8a94"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:59:07.201243 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.201219 2566 generic.go:358] "Generic (PLEG): container finished" podID="343db112-9603-4a01-95dd-da645bc1a35a" containerID="cb8ca2d62fa26abbab2b32f6b8e42b2414b75b61112abac7b7830a5ad6b6084b" exitCode=0 Apr 17 17:59:07.201370 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.201281 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerDied","Data":"cb8ca2d62fa26abbab2b32f6b8e42b2414b75b61112abac7b7830a5ad6b6084b"} Apr 17 17:59:07.202522 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.202509 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:59:07.202901 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.202881 2566 generic.go:358] "Generic (PLEG): container finished" podID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerID="1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43" exitCode=0 Apr 17 17:59:07.202981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.202919 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerDied","Data":"1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43"} Apr 17 17:59:07.202981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.202952 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" event={"ID":"83350f4e-1bbd-4246-b831-14e1a75e8a94","Type":"ContainerDied","Data":"f590b3b20ea9f32d00dfe22a6bbe4178a51a7defba06eb0ecc057b4aa6e663c3"} Apr 17 17:59:07.202981 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.202973 2566 scope.go:117] "RemoveContainer" containerID="628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e" Apr 17 17:59:07.203123 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.202982 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9" Apr 17 17:59:07.211045 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.211020 2566 scope.go:117] "RemoveContainer" containerID="1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43" Apr 17 17:59:07.217918 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.217902 2566 scope.go:117] "RemoveContainer" containerID="fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279" Apr 17 17:59:07.226955 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.226918 2566 scope.go:117] "RemoveContainer" containerID="628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e" Apr 17 17:59:07.227748 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:59:07.227728 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e\": container with ID starting with 628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e not found: ID does not exist" containerID="628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e" Apr 17 17:59:07.227830 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.227759 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e"} err="failed to get container status \"628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e\": rpc error: code = NotFound desc = could not find container \"628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e\": container with ID starting with 628f30c342cd7d5717d724688732530962b310d596d10979de7540fb671ccc8e not found: ID does not exist" Apr 17 17:59:07.227830 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.227784 2566 scope.go:117] "RemoveContainer" containerID="1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43" Apr 17 17:59:07.228133 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:59:07.228103 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43\": container with ID starting with 1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43 not found: ID does not exist" containerID="1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43" Apr 17 17:59:07.228224 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.228144 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43"} err="failed to get container status \"1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43\": rpc error: code = NotFound desc = could not find container \"1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43\": container with ID starting with 1fd4c8fad7420e99445344d017e3a5e92c461970aaeab67f866fbf8d10fe7e43 not found: ID does not exist" Apr 17 17:59:07.228224 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.228168 2566 scope.go:117] "RemoveContainer" containerID="fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279" Apr 17 17:59:07.228467 ip-10-0-140-147 kubenswrapper[2566]: E0417 17:59:07.228445 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279\": container with ID starting with fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279 not found: ID does not exist" containerID="fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279" Apr 17 17:59:07.228535 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.228478 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279"} err="failed to get container status \"fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279\": rpc error: code = NotFound desc = could not find container \"fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279\": container with ID starting with fbd2759fa76db387db4dbc1bd042096c7cb406f80471cce8d343414de368b279 not found: ID does not exist" Apr 17 17:59:07.242522 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.242501 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9"] Apr 17 17:59:07.246145 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.246126 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fbvq9"] Apr 17 17:59:07.289166 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.289139 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rjwn\" (UniqueName: \"kubernetes.io/projected/83350f4e-1bbd-4246-b831-14e1a75e8a94-kube-api-access-6rjwn\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:59:07.289166 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.289169 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83350f4e-1bbd-4246-b831-14e1a75e8a94-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:59:07.289320 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:07.289180 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83350f4e-1bbd-4246-b831-14e1a75e8a94-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 17:59:08.208241 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:08.208206 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerStarted","Data":"2662912e3490d7f685d62ceeb08f4764464ff9011e5f41dea50aafdae6d12f19"} Apr 17 17:59:08.208720 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:08.208246 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerStarted","Data":"9cc2fcdfac52d778db4e18caf6aedbd1dbc21177bf797983012e23ea19ed43a1"} Apr 17 17:59:08.208720 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:08.208584 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:08.208720 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:08.208716 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:08.209845 ip-10-0-140-147 kubenswrapper[2566]: I0417 
17:59:08.209818 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:59:08.232248 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:08.232208 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podStartSLOduration=7.23219471 podStartE2EDuration="7.23219471s" podCreationTimestamp="2026-04-17 17:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:59:08.230500701 +0000 UTC m=+2059.900580355" watchObservedRunningTime="2026-04-17 17:59:08.23219471 +0000 UTC m=+2059.902274352" Apr 17 17:59:08.838944 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:08.838912 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" path="/var/lib/kubelet/pods/83350f4e-1bbd-4246-b831-14e1a75e8a94/volumes" Apr 17 17:59:09.211442 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:09.211363 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:59:14.216188 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:14.216158 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 17:59:14.216783 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:14.216754 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:59:24.216953 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:24.216915 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:59:34.216928 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:34.216886 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:59:44.217588 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:44.217551 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:59:48.904122 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:48.904093 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:59:48.906198 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:48.906176 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 17:59:54.217016 ip-10-0-140-147 kubenswrapper[2566]: I0417 17:59:54.216980 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 18:00:04.216644 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:04.216600 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 18:00:14.217058 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:14.217017 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 18:00:24.217310 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:24.217279 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 18:00:31.619445 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.619409 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4"] Apr 17 18:00:31.619835 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.619741 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" containerID="cri-o://9cc2fcdfac52d778db4e18caf6aedbd1dbc21177bf797983012e23ea19ed43a1" gracePeriod=30 Apr 17 18:00:31.619835 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.619783 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kube-rbac-proxy" containerID="cri-o://2662912e3490d7f685d62ceeb08f4764464ff9011e5f41dea50aafdae6d12f19" gracePeriod=30 Apr 17 18:00:31.752216 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752184 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6"] Apr 17 18:00:31.752527 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752514 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="storage-initializer" Apr 17 18:00:31.752572 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752529 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="storage-initializer" Apr 17 18:00:31.752572 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752537 2566 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" Apr 17 18:00:31.752572 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752544 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" Apr 17 18:00:31.752572 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752556 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kube-rbac-proxy" Apr 17 18:00:31.752572 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752562 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kube-rbac-proxy" Apr 17 18:00:31.752726 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752616 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kube-rbac-proxy" Apr 17 18:00:31.752726 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.752628 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="83350f4e-1bbd-4246-b831-14e1a75e8a94" containerName="kserve-container" Apr 17 18:00:31.755684 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.755666 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.757805 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.757785 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 17 18:00:31.757938 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.757916 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 17 18:00:31.765185 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.765162 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6"] Apr 17 18:00:31.888882 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.888792 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d87f026b-345b-4c86-b78c-03c7739952ec-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.889055 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.888899 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d87f026b-345b-4c86-b78c-03c7739952ec-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.889055 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.888938 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d87f026b-345b-4c86-b78c-03c7739952ec-proxy-tls\") pod 
\"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.889055 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.888975 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbf69\" (UniqueName: \"kubernetes.io/projected/d87f026b-345b-4c86-b78c-03c7739952ec-kube-api-access-rbf69\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.989923 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.989893 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d87f026b-345b-4c86-b78c-03c7739952ec-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.990079 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.989929 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d87f026b-345b-4c86-b78c-03c7739952ec-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.990079 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.989956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbf69\" (UniqueName: \"kubernetes.io/projected/d87f026b-345b-4c86-b78c-03c7739952ec-kube-api-access-rbf69\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.990079 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.989989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d87f026b-345b-4c86-b78c-03c7739952ec-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.990369 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.990348 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d87f026b-345b-4c86-b78c-03c7739952ec-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.990646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.990627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d87f026b-345b-4c86-b78c-03c7739952ec-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: 
\"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.992529 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.992498 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d87f026b-345b-4c86-b78c-03c7739952ec-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:31.998898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:31.998875 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbf69\" (UniqueName: \"kubernetes.io/projected/d87f026b-345b-4c86-b78c-03c7739952ec-kube-api-access-rbf69\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:32.066015 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:32.065986 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:32.190439 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:32.190413 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6"] Apr 17 18:00:32.192335 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:00:32.192308 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd87f026b_345b_4c86_b78c_03c7739952ec.slice/crio-1c000d2b632f3b33b7c83c0ee4b02e50022daca651ff21890ebf22cabe469556 WatchSource:0}: Error finding container 1c000d2b632f3b33b7c83c0ee4b02e50022daca651ff21890ebf22cabe469556: Status 404 returned error can't find the container with id 1c000d2b632f3b33b7c83c0ee4b02e50022daca651ff21890ebf22cabe469556 Apr 17 18:00:32.464210 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:32.464104 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerStarted","Data":"ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40"} Apr 17 18:00:32.464210 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:32.464147 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerStarted","Data":"1c000d2b632f3b33b7c83c0ee4b02e50022daca651ff21890ebf22cabe469556"} Apr 17 18:00:32.466105 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:32.466078 2566 generic.go:358] "Generic (PLEG): container finished" podID="343db112-9603-4a01-95dd-da645bc1a35a" containerID="2662912e3490d7f685d62ceeb08f4764464ff9011e5f41dea50aafdae6d12f19" exitCode=2 Apr 17 18:00:32.466234 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:32.466122 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerDied","Data":"2662912e3490d7f685d62ceeb08f4764464ff9011e5f41dea50aafdae6d12f19"} Apr 17 18:00:34.212609 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:34.212573 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 17 18:00:34.216915 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:34.216893 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 18:00:36.479786 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.479701 2566 generic.go:358] "Generic (PLEG): container finished" podID="d87f026b-345b-4c86-b78c-03c7739952ec" containerID="ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40" exitCode=0 Apr 17 18:00:36.480196 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.479778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerDied","Data":"ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40"} Apr 17 18:00:36.481664 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.481643 2566 generic.go:358] "Generic (PLEG): container finished" podID="343db112-9603-4a01-95dd-da645bc1a35a" containerID="9cc2fcdfac52d778db4e18caf6aedbd1dbc21177bf797983012e23ea19ed43a1" exitCode=0 Apr 17 18:00:36.481758 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.481716 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerDied","Data":"9cc2fcdfac52d778db4e18caf6aedbd1dbc21177bf797983012e23ea19ed43a1"} Apr 17 18:00:36.851016 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.850997 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 18:00:36.928049 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.928016 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/343db112-9603-4a01-95dd-da645bc1a35a-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"343db112-9603-4a01-95dd-da645bc1a35a\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " Apr 17 18:00:36.928216 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.928058 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/343db112-9603-4a01-95dd-da645bc1a35a-kserve-provision-location\") pod \"343db112-9603-4a01-95dd-da645bc1a35a\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " Apr 17 18:00:36.928216 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.928096 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls\") pod \"343db112-9603-4a01-95dd-da645bc1a35a\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " Apr 17 18:00:36.928216 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.928129 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqhmv\" (UniqueName: \"kubernetes.io/projected/343db112-9603-4a01-95dd-da645bc1a35a-kube-api-access-mqhmv\") pod \"343db112-9603-4a01-95dd-da645bc1a35a\" (UID: \"343db112-9603-4a01-95dd-da645bc1a35a\") " Apr 17 18:00:36.928460 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.928429 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343db112-9603-4a01-95dd-da645bc1a35a-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "343db112-9603-4a01-95dd-da645bc1a35a" (UID: "343db112-9603-4a01-95dd-da645bc1a35a"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:00:36.928529 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.928505 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343db112-9603-4a01-95dd-da645bc1a35a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "343db112-9603-4a01-95dd-da645bc1a35a" (UID: "343db112-9603-4a01-95dd-da645bc1a35a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:00:36.930222 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.930198 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "343db112-9603-4a01-95dd-da645bc1a35a" (UID: "343db112-9603-4a01-95dd-da645bc1a35a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:00:36.930342 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:36.930198 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343db112-9603-4a01-95dd-da645bc1a35a-kube-api-access-mqhmv" (OuterVolumeSpecName: "kube-api-access-mqhmv") pod "343db112-9603-4a01-95dd-da645bc1a35a" (UID: "343db112-9603-4a01-95dd-da645bc1a35a"). InnerVolumeSpecName "kube-api-access-mqhmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:00:37.029282 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.029179 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqhmv\" (UniqueName: \"kubernetes.io/projected/343db112-9603-4a01-95dd-da645bc1a35a-kube-api-access-mqhmv\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:00:37.029282 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.029217 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/343db112-9603-4a01-95dd-da645bc1a35a-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:00:37.029282 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.029228 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/343db112-9603-4a01-95dd-da645bc1a35a-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:00:37.029282 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.029237 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/343db112-9603-4a01-95dd-da645bc1a35a-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:00:37.485973 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.485938 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerStarted","Data":"23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde"} Apr 17 18:00:37.485973 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.485980 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerStarted","Data":"ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f"} Apr 17 18:00:37.486542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.486287 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:37.486542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.486317 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:00:37.487540 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.487517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" event={"ID":"343db112-9603-4a01-95dd-da645bc1a35a","Type":"ContainerDied","Data":"e12a5d5286611ce93039cba2ad39af58e3e69ffe6832c9e95990a0cbf72eacac"} Apr 17 18:00:37.487655 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.487545 2566 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4" Apr 17 18:00:37.487655 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.487556 2566 scope.go:117] "RemoveContainer" containerID="2662912e3490d7f685d62ceeb08f4764464ff9011e5f41dea50aafdae6d12f19" Apr 17 18:00:37.496063 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.496045 2566 scope.go:117] "RemoveContainer" containerID="9cc2fcdfac52d778db4e18caf6aedbd1dbc21177bf797983012e23ea19ed43a1" Apr 17 18:00:37.502793 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.502778 2566 scope.go:117] "RemoveContainer" containerID="cb8ca2d62fa26abbab2b32f6b8e42b2414b75b61112abac7b7830a5ad6b6084b" Apr 17 18:00:37.534586 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.534542 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podStartSLOduration=6.534529161 podStartE2EDuration="6.534529161s" podCreationTimestamp="2026-04-17 18:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:00:37.533924739 +0000 UTC m=+2149.204004381" watchObservedRunningTime="2026-04-17 18:00:37.534529161 +0000 UTC m=+2149.204608802" Apr 17 18:00:37.555953 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.555926 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4"] Apr 17 18:00:37.564730 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:37.564705 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-jzkj4"] Apr 17 18:00:38.838164 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:38.838122 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343db112-9603-4a01-95dd-da645bc1a35a" path="/var/lib/kubelet/pods/343db112-9603-4a01-95dd-da645bc1a35a/volumes" Apr 17 18:00:43.498924 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:00:43.498900 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:01:13.499907 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:01:13.499868 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 18:01:23.499919 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:01:23.499880 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 18:01:33.500405 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:01:33.500367 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.40:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 18:01:43.500400 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:01:43.500360 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 18:01:49.834896 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:01:49.834846 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 18:01:59.838591 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:01:59.838508 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:02:01.819160 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.819124 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6"] Apr 17 18:02:01.819648 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.819474 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" containerID="cri-o://ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f" gracePeriod=30 Apr 17 18:02:01.819648 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.819525 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kube-rbac-proxy" containerID="cri-o://23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde" gracePeriod=30 Apr 17 18:02:01.951949 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.951910 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw"] Apr 17 18:02:01.952269 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952244 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952270 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952290 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kube-rbac-proxy" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952295 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kube-rbac-proxy" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952305 2566 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="storage-initializer" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952311 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="storage-initializer" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952354 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kserve-container" Apr 17 18:02:01.952492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.952364 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="343db112-9603-4a01-95dd-da645bc1a35a" containerName="kube-rbac-proxy" Apr 17 18:02:01.955324 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.955307 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:01.958733 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.958712 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 17 18:02:01.959191 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.959173 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 17 18:02:01.979320 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:01.979287 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw"] Apr 17 18:02:02.030377 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.030336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db0ea3f3-8fb5-40fe-988d-c8df190add27-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.030574 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.030398 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh57c\" (UniqueName: \"kubernetes.io/projected/db0ea3f3-8fb5-40fe-988d-c8df190add27-kube-api-access-fh57c\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.030574 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.030498 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db0ea3f3-8fb5-40fe-988d-c8df190add27-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.030574 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.030557 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/db0ea3f3-8fb5-40fe-988d-c8df190add27-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.131550 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.131447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh57c\" (UniqueName: \"kubernetes.io/projected/db0ea3f3-8fb5-40fe-988d-c8df190add27-kube-api-access-fh57c\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.131550 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.131505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db0ea3f3-8fb5-40fe-988d-c8df190add27-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.131785 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.131556 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db0ea3f3-8fb5-40fe-988d-c8df190add27-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.131785 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.131585 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db0ea3f3-8fb5-40fe-988d-c8df190add27-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.132019 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.131997 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db0ea3f3-8fb5-40fe-988d-c8df190add27-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.132294 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.132266 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db0ea3f3-8fb5-40fe-988d-c8df190add27-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.134127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.134096 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db0ea3f3-8fb5-40fe-988d-c8df190add27-proxy-tls\") pod 
\"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.139364 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.139338 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh57c\" (UniqueName: \"kubernetes.io/projected/db0ea3f3-8fb5-40fe-988d-c8df190add27-kube-api-access-fh57c\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.265246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.265199 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:02.383975 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.383946 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw"] Apr 17 18:02:02.386326 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:02:02.386297 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0ea3f3_8fb5_40fe_988d_c8df190add27.slice/crio-7db0e2e083dbf3126ae6915b6e7b78326c8508af9647b07f4df1d6fe48352613 WatchSource:0}: Error finding container 7db0e2e083dbf3126ae6915b6e7b78326c8508af9647b07f4df1d6fe48352613: Status 404 returned error can't find the container with id 7db0e2e083dbf3126ae6915b6e7b78326c8508af9647b07f4df1d6fe48352613 Apr 17 18:02:02.747780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.747688 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerStarted","Data":"0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a"} Apr 17 18:02:02.747780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.747728 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerStarted","Data":"7db0e2e083dbf3126ae6915b6e7b78326c8508af9647b07f4df1d6fe48352613"} Apr 17 18:02:02.749511 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.749483 2566 generic.go:358] "Generic (PLEG): container finished" podID="d87f026b-345b-4c86-b78c-03c7739952ec" containerID="23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde" exitCode=2 Apr 17 18:02:02.749658 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:02.749517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerDied","Data":"23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde"} Apr 17 18:02:03.493923 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:03.493880 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 17 18:02:06.660739 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.660718 
2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:02:06.762926 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.762842 2566 generic.go:358] "Generic (PLEG): container finished" podID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerID="0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a" exitCode=0 Apr 17 18:02:06.762926 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.762913 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerDied","Data":"0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a"} Apr 17 18:02:06.764619 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.764596 2566 generic.go:358] "Generic (PLEG): container finished" podID="d87f026b-345b-4c86-b78c-03c7739952ec" containerID="ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f" exitCode=0 Apr 17 18:02:06.764716 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.764633 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerDied","Data":"ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f"} Apr 17 18:02:06.764716 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.764651 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" event={"ID":"d87f026b-345b-4c86-b78c-03c7739952ec","Type":"ContainerDied","Data":"1c000d2b632f3b33b7c83c0ee4b02e50022daca651ff21890ebf22cabe469556"} Apr 17 18:02:06.764716 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.764668 2566 scope.go:117] "RemoveContainer" containerID="23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde" Apr 17 18:02:06.764716 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.764671 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6" Apr 17 18:02:06.772187 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.772168 2566 scope.go:117] "RemoveContainer" containerID="ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f" Apr 17 18:02:06.776289 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.776266 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d87f026b-345b-4c86-b78c-03c7739952ec-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"d87f026b-345b-4c86-b78c-03c7739952ec\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " Apr 17 18:02:06.776395 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.776321 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d87f026b-345b-4c86-b78c-03c7739952ec-kserve-provision-location\") pod \"d87f026b-345b-4c86-b78c-03c7739952ec\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " Apr 17 18:02:06.776395 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.776353 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbf69\" (UniqueName: \"kubernetes.io/projected/d87f026b-345b-4c86-b78c-03c7739952ec-kube-api-access-rbf69\") pod \"d87f026b-345b-4c86-b78c-03c7739952ec\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " Apr 17 18:02:06.776395 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.776391 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d87f026b-345b-4c86-b78c-03c7739952ec-proxy-tls\") pod \"d87f026b-345b-4c86-b78c-03c7739952ec\" (UID: \"d87f026b-345b-4c86-b78c-03c7739952ec\") " Apr 17 18:02:06.776660 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.776628 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d87f026b-345b-4c86-b78c-03c7739952ec-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "d87f026b-345b-4c86-b78c-03c7739952ec" (UID: "d87f026b-345b-4c86-b78c-03c7739952ec"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:02:06.776778 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.776673 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d87f026b-345b-4c86-b78c-03c7739952ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d87f026b-345b-4c86-b78c-03c7739952ec" (UID: "d87f026b-345b-4c86-b78c-03c7739952ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:02:06.778448 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.778423 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d87f026b-345b-4c86-b78c-03c7739952ec-kube-api-access-rbf69" (OuterVolumeSpecName: "kube-api-access-rbf69") pod "d87f026b-345b-4c86-b78c-03c7739952ec" (UID: "d87f026b-345b-4c86-b78c-03c7739952ec"). InnerVolumeSpecName "kube-api-access-rbf69". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:02:06.778568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.778424 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d87f026b-345b-4c86-b78c-03c7739952ec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d87f026b-345b-4c86-b78c-03c7739952ec" (UID: "d87f026b-345b-4c86-b78c-03c7739952ec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:02:06.779806 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.779699 2566 scope.go:117] "RemoveContainer" containerID="ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40" Apr 17 18:02:06.790319 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.790303 2566 scope.go:117] "RemoveContainer" containerID="23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde" Apr 17 18:02:06.790547 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:02:06.790527 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde\": container with ID starting with 23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde not found: ID does not exist" containerID="23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde" Apr 17 18:02:06.790602 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.790555 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde"} err="failed to get container status \"23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde\": rpc error: code = NotFound desc = could not find container \"23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde\": container with ID starting with 23077df0f3a7766afc87d21ba57263b594b8c79402428e5b8aa52ede24cb5dde not found: ID does not exist" Apr 17 18:02:06.790602 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.790573 2566 scope.go:117] "RemoveContainer" containerID="ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f" Apr 17 18:02:06.790788 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:02:06.790770 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f\": container with ID starting with ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f not found: ID does not exist" containerID="ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f" Apr 17 18:02:06.790830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.790795 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f"} err="failed to get container status \"ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f\": rpc error: code = NotFound desc = could not find container \"ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f\": container with ID starting with ca6b6d47c7339e667d9bbe038f970707808baebe574df4234d0b94c191ea732f not found: ID does not exist" Apr 17 18:02:06.790830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.790810 2566 scope.go:117] "RemoveContainer" containerID="ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40" Apr 17 18:02:06.791067 ip-10-0-140-147 kubenswrapper[2566]: E0417 
18:02:06.791050 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40\": container with ID starting with ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40 not found: ID does not exist" containerID="ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40" Apr 17 18:02:06.791120 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.791071 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40"} err="failed to get container status \"ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40\": rpc error: code = NotFound desc = could not find container \"ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40\": container with ID starting with ab54992ec99c8811502b61e5dd11d4820a6b9881bbca49f7672068c9fca93e40 not found: ID does not exist" Apr 17 18:02:06.877411 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.877386 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d87f026b-345b-4c86-b78c-03c7739952ec-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:02:06.877533 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.877416 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d87f026b-345b-4c86-b78c-03c7739952ec-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:02:06.877533 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.877432 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbf69\" (UniqueName: \"kubernetes.io/projected/d87f026b-345b-4c86-b78c-03c7739952ec-kube-api-access-rbf69\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:02:06.877533 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:06.877444 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d87f026b-345b-4c86-b78c-03c7739952ec-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:02:07.083338 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.083307 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6"] Apr 17 18:02:07.089204 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.089179 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9qst6"] Apr 17 18:02:07.770298 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.770248 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerStarted","Data":"a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd"} Apr 17 18:02:07.770751 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.770306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerStarted","Data":"4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36"} Apr 17 
18:02:07.770751 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.770628 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:07.770751 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.770659 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:07.789820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:07.789741 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podStartSLOduration=6.789725091 podStartE2EDuration="6.789725091s" podCreationTimestamp="2026-04-17 18:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:02:07.788402275 +0000 UTC m=+2239.458481916" watchObservedRunningTime="2026-04-17 18:02:07.789725091 +0000 UTC m=+2239.459804735" Apr 17 18:02:08.838830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:08.838794 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" path="/var/lib/kubelet/pods/d87f026b-345b-4c86-b78c-03c7739952ec/volumes" Apr 17 18:02:13.780309 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:13.780279 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:02:43.781150 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:43.781108 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 18:02:53.781233 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:02:53.781191 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 18:03:03.781078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:03.781039 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 18:03:13.781767 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:13.781720 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 18:03:23.784582 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:23.784551 2566 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:03:32.061685 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.061649 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw"] Apr 17 18:03:32.062332 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.062063 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" containerID="cri-o://4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36" gracePeriod=30 Apr 17 18:03:32.062332 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.062113 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kube-rbac-proxy" containerID="cri-o://a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd" gracePeriod=30 Apr 17 18:03:32.154269 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154224 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d"] Apr 17 18:03:32.154611 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154594 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="storage-initializer" Apr 17 18:03:32.154656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154614 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="storage-initializer" Apr 17 18:03:32.154656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154626 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" Apr 17 18:03:32.154656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154632 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" Apr 17 18:03:32.154656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154640 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kube-rbac-proxy" Apr 17 18:03:32.154656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154647 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kube-rbac-proxy" Apr 17 18:03:32.154823 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154721 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kserve-container" Apr 17 18:03:32.154823 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.154731 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d87f026b-345b-4c86-b78c-03c7739952ec" containerName="kube-rbac-proxy" Apr 17 18:03:32.157924 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.157909 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.159888 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.159866 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 17 18:03:32.159990 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.159906 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 17 18:03:32.166536 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.166512 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d"] Apr 17 18:03:32.200676 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.200645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hb7x\" (UniqueName: \"kubernetes.io/projected/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kube-api-access-5hb7x\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.200676 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.200677 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.200863 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.200700 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdbf7bdc-838e-4dbd-809b-bae32fefd176-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.200863 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.200726 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdbf7bdc-838e-4dbd-809b-bae32fefd176-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.301516 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.301485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hb7x\" (UniqueName: \"kubernetes.io/projected/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kube-api-access-5hb7x\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.301516 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.301519 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.301753 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.301543 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdbf7bdc-838e-4dbd-809b-bae32fefd176-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.301753 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.301572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdbf7bdc-838e-4dbd-809b-bae32fefd176-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.301937 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.301919 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.302191 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.302170 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdbf7bdc-838e-4dbd-809b-bae32fefd176-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.303957 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.303931 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdbf7bdc-838e-4dbd-809b-bae32fefd176-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.317733 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.317675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hb7x\" (UniqueName: \"kubernetes.io/projected/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kube-api-access-5hb7x\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.469411 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.469375 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:32.588632 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:32.588605 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d"] Apr 17 18:03:32.591073 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:03:32.591044 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbf7bdc_838e_4dbd_809b_bae32fefd176.slice/crio-4039fd5c9ec4d3435f8737e3cf35c0e4f931ede18c0bc97a0e5ded2364fbee8f WatchSource:0}: Error finding container 4039fd5c9ec4d3435f8737e3cf35c0e4f931ede18c0bc97a0e5ded2364fbee8f: Status 404 returned error can't find the container with id 4039fd5c9ec4d3435f8737e3cf35c0e4f931ede18c0bc97a0e5ded2364fbee8f Apr 17 18:03:33.015109 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:33.015004 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerStarted","Data":"57b238446eb516f60ea8ea8fbe35766f3a5510d9cc21c40a4ac4999a92646ce8"} Apr 17 18:03:33.015109 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:33.015053 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerStarted","Data":"4039fd5c9ec4d3435f8737e3cf35c0e4f931ede18c0bc97a0e5ded2364fbee8f"} Apr 17 18:03:33.016966 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:33.016938 2566 generic.go:358] "Generic (PLEG): container finished" podID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerID="a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd" exitCode=2 Apr 17 18:03:33.017114 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:33.016974 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerDied","Data":"a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd"} Apr 17 18:03:33.775713 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:33.775673 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused" Apr 17 18:03:33.781730 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:33.781700 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 18:03:36.802761 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.802739 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:03:36.944911 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.944874 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db0ea3f3-8fb5-40fe-988d-c8df190add27-kserve-provision-location\") pod \"db0ea3f3-8fb5-40fe-988d-c8df190add27\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " Apr 17 18:03:36.944911 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.944921 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh57c\" (UniqueName: \"kubernetes.io/projected/db0ea3f3-8fb5-40fe-988d-c8df190add27-kube-api-access-fh57c\") pod \"db0ea3f3-8fb5-40fe-988d-c8df190add27\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " Apr 17 18:03:36.945167 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.944943 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db0ea3f3-8fb5-40fe-988d-c8df190add27-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"db0ea3f3-8fb5-40fe-988d-c8df190add27\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " Apr 17 18:03:36.945167 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.945024 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db0ea3f3-8fb5-40fe-988d-c8df190add27-proxy-tls\") pod \"db0ea3f3-8fb5-40fe-988d-c8df190add27\" (UID: \"db0ea3f3-8fb5-40fe-988d-c8df190add27\") " Apr 17 18:03:36.945325 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.945207 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db0ea3f3-8fb5-40fe-988d-c8df190add27-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "db0ea3f3-8fb5-40fe-988d-c8df190add27" (UID: "db0ea3f3-8fb5-40fe-988d-c8df190add27"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:03:36.945403 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.945374 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db0ea3f3-8fb5-40fe-988d-c8df190add27-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "db0ea3f3-8fb5-40fe-988d-c8df190add27" (UID: "db0ea3f3-8fb5-40fe-988d-c8df190add27"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:03:36.947078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.947051 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0ea3f3-8fb5-40fe-988d-c8df190add27-kube-api-access-fh57c" (OuterVolumeSpecName: "kube-api-access-fh57c") pod "db0ea3f3-8fb5-40fe-988d-c8df190add27" (UID: "db0ea3f3-8fb5-40fe-988d-c8df190add27"). InnerVolumeSpecName "kube-api-access-fh57c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:03:36.947078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:36.947066 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ea3f3-8fb5-40fe-988d-c8df190add27-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db0ea3f3-8fb5-40fe-988d-c8df190add27" (UID: "db0ea3f3-8fb5-40fe-988d-c8df190add27"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:03:37.029495 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.029459 2566 generic.go:358] "Generic (PLEG): container finished" podID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerID="57b238446eb516f60ea8ea8fbe35766f3a5510d9cc21c40a4ac4999a92646ce8" exitCode=0 Apr 17 18:03:37.029662 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.029541 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerDied","Data":"57b238446eb516f60ea8ea8fbe35766f3a5510d9cc21c40a4ac4999a92646ce8"} Apr 17 18:03:37.031090 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.031067 2566 generic.go:358] "Generic (PLEG): container finished" podID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerID="4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36" exitCode=0 Apr 17 18:03:37.031190 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.031110 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerDied","Data":"4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36"} Apr 17 18:03:37.031190 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.031130 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" event={"ID":"db0ea3f3-8fb5-40fe-988d-c8df190add27","Type":"ContainerDied","Data":"7db0e2e083dbf3126ae6915b6e7b78326c8508af9647b07f4df1d6fe48352613"} Apr 17 18:03:37.031190 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.031144 2566 scope.go:117] "RemoveContainer" containerID="a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd" Apr 17 18:03:37.031190 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.031153 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw" Apr 17 18:03:37.041363 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.041340 2566 scope.go:117] "RemoveContainer" containerID="4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36" Apr 17 18:03:37.045490 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.045473 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db0ea3f3-8fb5-40fe-988d-c8df190add27-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:03:37.045590 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.045492 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db0ea3f3-8fb5-40fe-988d-c8df190add27-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:03:37.045590 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.045502 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fh57c\" (UniqueName: \"kubernetes.io/projected/db0ea3f3-8fb5-40fe-988d-c8df190add27-kube-api-access-fh57c\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:03:37.045590 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.045513 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db0ea3f3-8fb5-40fe-988d-c8df190add27-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:03:37.049361 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.049336 2566 scope.go:117] "RemoveContainer" containerID="0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a" Apr 17 18:03:37.057699 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.057674 2566 scope.go:117] "RemoveContainer" containerID="a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd" Apr 17 18:03:37.058001 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:03:37.057982 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd\": container with ID starting with a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd not found: ID does not exist" containerID="a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd" Apr 17 18:03:37.058072 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.058011 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd"} err="failed to get container status \"a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd\": rpc error: code = NotFound desc = could not find container \"a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd\": container with ID starting with a1a0891f570ea313e87eea8fa01bedd712c178e7cd0032502491ac738d04d4bd not found: ID does not exist" Apr 17 18:03:37.058072 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.058031 2566 scope.go:117] "RemoveContainer" containerID="4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36" Apr 17 18:03:37.058352 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:03:37.058335 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36\": container with ID starting with 4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36 not found: ID does not exist" containerID="4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36" Apr 17 18:03:37.058426 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.058357 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36"} err="failed to get container status \"4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36\": rpc error: code = NotFound desc = could not find container \"4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36\": container with ID starting with 4d55ec023307d91219634b929b25b73fca1b2234ff31de96ad15fad7f52ada36 not found: ID does not exist" Apr 17 18:03:37.058426 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.058372 2566 scope.go:117] "RemoveContainer" containerID="0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a" Apr 17 18:03:37.058647 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:03:37.058629 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a\": container with ID starting with 0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a not found: ID does not exist" containerID="0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a" Apr 17 18:03:37.058700 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.058654 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a"} err="failed to get container status \"0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a\": rpc error: code = NotFound desc = could not find container \"0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a\": container with ID starting with 0bd9507104baf7fc66599ac04e72396f9c3dd09eeea4ccf63f9147986686ab3a not found: ID does not exist" Apr 17 18:03:37.058834 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.058817 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw"] Apr 17 18:03:37.061628 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:37.061605 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-llqnw"] Apr 17 18:03:38.036232 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:38.036188 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerStarted","Data":"9d04048916bf62f0e5ce4cf444e0fce4c7760fffeb168fca9d4d098cdee97ce8"} Apr 17 18:03:38.036820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:38.036247 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerStarted","Data":"88ac8e1c7af58ac1a47f0392eafe5863ee141d5f097c75372586824cef1ff188"} Apr 17 18:03:38.036820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:38.036592 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:38.036820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:38.036622 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:03:38.055324 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:38.055270 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podStartSLOduration=6.055243013 podStartE2EDuration="6.055243013s" podCreationTimestamp="2026-04-17 18:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:03:38.053459299 +0000 UTC m=+2329.723538953" watchObservedRunningTime="2026-04-17 18:03:38.055243013 +0000 UTC m=+2329.725322654" Apr 17 18:03:38.838898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:38.838857 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" path="/var/lib/kubelet/pods/db0ea3f3-8fb5-40fe-988d-c8df190add27/volumes" Apr 17 18:03:44.044941 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:03:44.044916 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:04:14.045576 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:14.045534 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 18:04:24.045773 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:24.045737 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 18:04:34.045562 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:34.045522 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 18:04:44.045677 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:44.045636 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 18:04:48.926170 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:48.926140 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 
18:04:48.928544 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:48.928525 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:04:54.048809 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:04:54.048776 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:05:02.276279 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:02.276182 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d"] Apr 17 18:05:02.276758 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:02.276500 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" containerID="cri-o://88ac8e1c7af58ac1a47f0392eafe5863ee141d5f097c75372586824cef1ff188" gracePeriod=30 Apr 17 18:05:02.277027 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:02.276547 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kube-rbac-proxy" containerID="cri-o://9d04048916bf62f0e5ce4cf444e0fce4c7760fffeb168fca9d4d098cdee97ce8" gracePeriod=30 Apr 17 18:05:03.286002 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:03.285967 2566 generic.go:358] "Generic (PLEG): container finished" podID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerID="9d04048916bf62f0e5ce4cf444e0fce4c7760fffeb168fca9d4d098cdee97ce8" exitCode=2 Apr 17 18:05:03.286397 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:03.286037 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerDied","Data":"9d04048916bf62f0e5ce4cf444e0fce4c7760fffeb168fca9d4d098cdee97ce8"} Apr 17 18:05:04.041218 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.041170 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 17 18:05:04.045552 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.045520 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 18:05:04.472007 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.471970 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg"] Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472351 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="storage-initializer" Apr 17 18:05:04.472518 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:05:04.472365 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="storage-initializer" Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472375 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kube-rbac-proxy" Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472381 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kube-rbac-proxy" Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472397 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472404 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472457 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kserve-container" Apr 17 18:05:04.472518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.472467 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="db0ea3f3-8fb5-40fe-988d-c8df190add27" containerName="kube-rbac-proxy" Apr 17 18:05:04.475630 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.475614 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.477837 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.477817 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 17 18:05:04.477956 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.477818 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 17 18:05:04.486555 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.486528 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg"] Apr 17 18:05:04.578045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.578002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7457f096-03cf-4eea-8155-125687ef0331-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.578045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.578050 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7457f096-03cf-4eea-8155-125687ef0331-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.578299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.578089 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7457f096-03cf-4eea-8155-125687ef0331-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.578299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.578133 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x47q2\" (UniqueName: \"kubernetes.io/projected/7457f096-03cf-4eea-8155-125687ef0331-kube-api-access-x47q2\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.678887 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.678849 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7457f096-03cf-4eea-8155-125687ef0331-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.678887 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.678892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7457f096-03cf-4eea-8155-125687ef0331-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.679139 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.678912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7457f096-03cf-4eea-8155-125687ef0331-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.679139 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.678945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x47q2\" (UniqueName: \"kubernetes.io/projected/7457f096-03cf-4eea-8155-125687ef0331-kube-api-access-x47q2\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.679405 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.679381 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7457f096-03cf-4eea-8155-125687ef0331-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.679630 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.679607 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7457f096-03cf-4eea-8155-125687ef0331-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.681445 
ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.681414 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7457f096-03cf-4eea-8155-125687ef0331-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.686306 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.686279 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x47q2\" (UniqueName: \"kubernetes.io/projected/7457f096-03cf-4eea-8155-125687ef0331-kube-api-access-x47q2\") pod \"isvc-sklearn-predictor-d8dbfbbb9-922cg\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.786592 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.786494 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:04.909348 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.909323 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg"] Apr 17 18:05:04.911572 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:05:04.911539 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7457f096_03cf_4eea_8155_125687ef0331.slice/crio-3ca69fe6aa582ddba8210045cde63287d31e62b22572161b185b222422f62ece WatchSource:0}: Error finding container 3ca69fe6aa582ddba8210045cde63287d31e62b22572161b185b222422f62ece: Status 404 returned error can't find the container with id 3ca69fe6aa582ddba8210045cde63287d31e62b22572161b185b222422f62ece Apr 17 18:05:04.913769 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:04.913753 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:05:05.293245 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:05.293207 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerStarted","Data":"85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3"} Apr 17 18:05:05.293245 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:05.293247 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerStarted","Data":"3ca69fe6aa582ddba8210045cde63287d31e62b22572161b185b222422f62ece"} Apr 17 18:05:07.301198 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.301108 2566 generic.go:358] "Generic (PLEG): container finished" podID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerID="88ac8e1c7af58ac1a47f0392eafe5863ee141d5f097c75372586824cef1ff188" exitCode=0 Apr 17 18:05:07.301198 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.301164 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerDied","Data":"88ac8e1c7af58ac1a47f0392eafe5863ee141d5f097c75372586824cef1ff188"} Apr 17 18:05:07.513823 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.513799 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:05:07.603108 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603074 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hb7x\" (UniqueName: \"kubernetes.io/projected/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kube-api-access-5hb7x\") pod \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " Apr 17 18:05:07.603304 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603116 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdbf7bdc-838e-4dbd-809b-bae32fefd176-proxy-tls\") pod \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " Apr 17 18:05:07.603304 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603140 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kserve-provision-location\") pod \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " Apr 17 18:05:07.603304 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603287 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdbf7bdc-838e-4dbd-809b-bae32fefd176-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\" (UID: \"bdbf7bdc-838e-4dbd-809b-bae32fefd176\") " Apr 17 18:05:07.603500 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603480 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bdbf7bdc-838e-4dbd-809b-bae32fefd176" (UID: "bdbf7bdc-838e-4dbd-809b-bae32fefd176"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:05:07.603622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603601 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbf7bdc-838e-4dbd-809b-bae32fefd176-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "bdbf7bdc-838e-4dbd-809b-bae32fefd176" (UID: "bdbf7bdc-838e-4dbd-809b-bae32fefd176"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:05:07.603671 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.603621 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:05:07.605311 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.605291 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kube-api-access-5hb7x" (OuterVolumeSpecName: "kube-api-access-5hb7x") pod "bdbf7bdc-838e-4dbd-809b-bae32fefd176" (UID: "bdbf7bdc-838e-4dbd-809b-bae32fefd176"). InnerVolumeSpecName "kube-api-access-5hb7x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:05:07.605378 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.605361 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdbf7bdc-838e-4dbd-809b-bae32fefd176-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bdbf7bdc-838e-4dbd-809b-bae32fefd176" (UID: "bdbf7bdc-838e-4dbd-809b-bae32fefd176"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:05:07.704583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.704548 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdbf7bdc-838e-4dbd-809b-bae32fefd176-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:05:07.704583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.704577 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5hb7x\" (UniqueName: \"kubernetes.io/projected/bdbf7bdc-838e-4dbd-809b-bae32fefd176-kube-api-access-5hb7x\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:05:07.704583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:07.704589 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdbf7bdc-838e-4dbd-809b-bae32fefd176-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:05:08.305588 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.305552 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" event={"ID":"bdbf7bdc-838e-4dbd-809b-bae32fefd176","Type":"ContainerDied","Data":"4039fd5c9ec4d3435f8737e3cf35c0e4f931ede18c0bc97a0e5ded2364fbee8f"} Apr 17 18:05:08.305588 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.305593 2566 scope.go:117] "RemoveContainer" containerID="9d04048916bf62f0e5ce4cf444e0fce4c7760fffeb168fca9d4d098cdee97ce8" Apr 17 18:05:08.306065 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.305614 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d" Apr 17 18:05:08.314150 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.314116 2566 scope.go:117] "RemoveContainer" containerID="88ac8e1c7af58ac1a47f0392eafe5863ee141d5f097c75372586824cef1ff188" Apr 17 18:05:08.323572 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.323497 2566 scope.go:117] "RemoveContainer" containerID="57b238446eb516f60ea8ea8fbe35766f3a5510d9cc21c40a4ac4999a92646ce8" Apr 17 18:05:08.327283 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.327244 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d"] Apr 17 18:05:08.331380 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.331355 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-q9b4d"] Apr 17 18:05:08.837537 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:08.837503 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" path="/var/lib/kubelet/pods/bdbf7bdc-838e-4dbd-809b-bae32fefd176/volumes" Apr 17 18:05:09.309955 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:09.309921 2566 generic.go:358] "Generic (PLEG): container finished" podID="7457f096-03cf-4eea-8155-125687ef0331" containerID="85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3" exitCode=0 Apr 17 18:05:09.310343 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:09.309991 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerDied","Data":"85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3"} Apr 17 18:05:10.314223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:10.314188 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerStarted","Data":"7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8"} Apr 17 18:05:10.314631 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:10.314230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerStarted","Data":"c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3"} Apr 17 18:05:10.314631 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:10.314594 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:10.314706 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:10.314691 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:10.315712 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:10.315686 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:05:10.332561 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:10.332514 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podStartSLOduration=6.332499754 podStartE2EDuration="6.332499754s" podCreationTimestamp="2026-04-17 18:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:05:10.331584471 +0000 UTC m=+2422.001664092" watchObservedRunningTime="2026-04-17 18:05:10.332499754 +0000 UTC m=+2422.002579396" Apr 17 18:05:11.317497 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:11.317460 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:05:16.321926 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:16.321892 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:05:16.322591 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:16.322563 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:05:26.322589 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:26.322549 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:05:36.322930 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:36.322890 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:05:46.322758 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:46.322712 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:05:56.322630 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:05:56.322587 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:06:06.323265 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:06.323215 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:06:16.323161 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:16.323132 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:06:24.705109 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705079 
2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv"] Apr 17 18:06:24.705513 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705432 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" Apr 17 18:06:24.705513 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705451 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" Apr 17 18:06:24.705513 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705474 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="storage-initializer" Apr 17 18:06:24.705513 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705483 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="storage-initializer" Apr 17 18:06:24.705513 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705497 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kube-rbac-proxy" Apr 17 18:06:24.705513 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705506 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kube-rbac-proxy" Apr 17 18:06:24.705826 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705578 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kserve-container" Apr 17 18:06:24.705826 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.705590 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdbf7bdc-838e-4dbd-809b-bae32fefd176" containerName="kube-rbac-proxy" Apr 17 18:06:24.711429 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.711407 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.713737 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.713713 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 17 18:06:24.713909 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.713890 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 17 18:06:24.723029 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.723008 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv"] Apr 17 18:06:24.749165 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.749141 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg"] Apr 17 18:06:24.749507 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.749459 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" containerID="cri-o://c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3" gracePeriod=30 Apr 17 18:06:24.749641 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.749490 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kube-rbac-proxy" containerID="cri-o://7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8" gracePeriod=30 Apr 17 18:06:24.865943 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.865917 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fbc5012-95be-4ad0-8ea5-6612d33e801c-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.866007 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.865956 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fbc5012-95be-4ad0-8ea5-6612d33e801c-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.866058 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.866041 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9sws\" (UniqueName: \"kubernetes.io/projected/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kube-api-access-f9sws\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.866095 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.866079 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.967559 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.967474 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fbc5012-95be-4ad0-8ea5-6612d33e801c-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.967559 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.967514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fbc5012-95be-4ad0-8ea5-6612d33e801c-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.967559 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.967551 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9sws\" (UniqueName: \"kubernetes.io/projected/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kube-api-access-f9sws\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.967805 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.967575 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.968050 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.968024 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.968312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.968289 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fbc5012-95be-4ad0-8ea5-6612d33e801c-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.969929 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.969910 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fbc5012-95be-4ad0-8ea5-6612d33e801c-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:24.976243 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:24.976220 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9sws\" (UniqueName: \"kubernetes.io/projected/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kube-api-access-f9sws\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t5ztv\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:25.023238 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:25.023199 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:25.143181 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:25.143129 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv"] Apr 17 18:06:25.145820 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:06:25.145793 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbc5012_95be_4ad0_8ea5_6612d33e801c.slice/crio-6a322b2cda88d150b660f3ec12eed3320cbbf014e9353050f3e262d352cb3ce8 WatchSource:0}: Error finding container 6a322b2cda88d150b660f3ec12eed3320cbbf014e9353050f3e262d352cb3ce8: Status 404 returned error can't find the container with id 6a322b2cda88d150b660f3ec12eed3320cbbf014e9353050f3e262d352cb3ce8 Apr 17 18:06:25.526970 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:25.526933 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerStarted","Data":"e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e"} Apr 17 18:06:25.526970 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:25.526973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerStarted","Data":"6a322b2cda88d150b660f3ec12eed3320cbbf014e9353050f3e262d352cb3ce8"} Apr 17 18:06:25.528895 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:25.528865 2566 generic.go:358] "Generic (PLEG): container finished" podID="7457f096-03cf-4eea-8155-125687ef0331" containerID="7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8" exitCode=2 Apr 17 18:06:25.529024 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:25.528937 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerDied","Data":"7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8"} Apr 17 18:06:26.318699 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:26.318616 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 17 18:06:26.322961 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:26.322936 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" podUID="7457f096-03cf-4eea-8155-125687ef0331" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 17 18:06:28.984879 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:28.984856 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:06:29.099769 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.099683 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x47q2\" (UniqueName: \"kubernetes.io/projected/7457f096-03cf-4eea-8155-125687ef0331-kube-api-access-x47q2\") pod \"7457f096-03cf-4eea-8155-125687ef0331\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " Apr 17 18:06:29.099769 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.099736 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7457f096-03cf-4eea-8155-125687ef0331-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"7457f096-03cf-4eea-8155-125687ef0331\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " Apr 17 18:06:29.099993 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.099775 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7457f096-03cf-4eea-8155-125687ef0331-proxy-tls\") pod \"7457f096-03cf-4eea-8155-125687ef0331\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " Apr 17 18:06:29.099993 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.099883 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7457f096-03cf-4eea-8155-125687ef0331-kserve-provision-location\") pod \"7457f096-03cf-4eea-8155-125687ef0331\" (UID: \"7457f096-03cf-4eea-8155-125687ef0331\") " Apr 17 18:06:29.100124 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.100100 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7457f096-03cf-4eea-8155-125687ef0331-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "7457f096-03cf-4eea-8155-125687ef0331" (UID: "7457f096-03cf-4eea-8155-125687ef0331"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:06:29.100201 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.100179 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7457f096-03cf-4eea-8155-125687ef0331-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7457f096-03cf-4eea-8155-125687ef0331" (UID: "7457f096-03cf-4eea-8155-125687ef0331"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:06:29.100280 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.100216 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7457f096-03cf-4eea-8155-125687ef0331-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:06:29.101876 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.101855 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7457f096-03cf-4eea-8155-125687ef0331-kube-api-access-x47q2" (OuterVolumeSpecName: "kube-api-access-x47q2") pod "7457f096-03cf-4eea-8155-125687ef0331" (UID: "7457f096-03cf-4eea-8155-125687ef0331"). InnerVolumeSpecName "kube-api-access-x47q2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:06:29.101876 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.101863 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f096-03cf-4eea-8155-125687ef0331-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7457f096-03cf-4eea-8155-125687ef0331" (UID: "7457f096-03cf-4eea-8155-125687ef0331"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:06:29.201610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.201561 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x47q2\" (UniqueName: \"kubernetes.io/projected/7457f096-03cf-4eea-8155-125687ef0331-kube-api-access-x47q2\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:06:29.201610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.201603 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7457f096-03cf-4eea-8155-125687ef0331-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:06:29.201610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.201614 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7457f096-03cf-4eea-8155-125687ef0331-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:06:29.544012 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.543920 2566 generic.go:358] "Generic (PLEG): container finished" podID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerID="e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e" exitCode=0 Apr 17 18:06:29.544012 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.543997 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerDied","Data":"e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e"} Apr 17 18:06:29.545763 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.545742 2566 generic.go:358] "Generic (PLEG): container finished" podID="7457f096-03cf-4eea-8155-125687ef0331" containerID="c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3" exitCode=0 Apr 17 18:06:29.545984 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.545784 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" 
event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerDied","Data":"c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3"} Apr 17 18:06:29.545984 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.545810 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" Apr 17 18:06:29.545984 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.545826 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg" event={"ID":"7457f096-03cf-4eea-8155-125687ef0331","Type":"ContainerDied","Data":"3ca69fe6aa582ddba8210045cde63287d31e62b22572161b185b222422f62ece"} Apr 17 18:06:29.545984 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.545849 2566 scope.go:117] "RemoveContainer" containerID="7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8" Apr 17 18:06:29.555088 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.555070 2566 scope.go:117] "RemoveContainer" containerID="c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3" Apr 17 18:06:29.563186 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.563134 2566 scope.go:117] "RemoveContainer" containerID="85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3" Apr 17 18:06:29.572473 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.572451 2566 scope.go:117] "RemoveContainer" containerID="7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8" Apr 17 18:06:29.572743 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:06:29.572724 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8\": container with ID starting with 7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8 not found: ID does not exist" containerID="7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8" Apr 17 18:06:29.572851 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.572758 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8"} err="failed to get container status \"7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8\": rpc error: code = NotFound desc = could not find container \"7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8\": container with ID starting with 7f6e8e67578d8ae42aa73b40388c9e767dc73f209311f0588693d6f801b505b8 not found: ID does not exist" Apr 17 18:06:29.572851 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.572784 2566 scope.go:117] "RemoveContainer" containerID="c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3" Apr 17 18:06:29.573127 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:06:29.573102 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3\": container with ID starting with c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3 not found: ID does not exist" containerID="c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3" Apr 17 18:06:29.573212 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.573132 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3"} 
err="failed to get container status \"c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3\": rpc error: code = NotFound desc = could not find container \"c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3\": container with ID starting with c1530705af74036ac969d7b516162dc64ccb01f3f2590f71fde5a5cbf50c67a3 not found: ID does not exist" Apr 17 18:06:29.573212 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.573147 2566 scope.go:117] "RemoveContainer" containerID="85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3" Apr 17 18:06:29.573434 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:06:29.573407 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3\": container with ID starting with 85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3 not found: ID does not exist" containerID="85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3" Apr 17 18:06:29.573539 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.573440 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3"} err="failed to get container status \"85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3\": rpc error: code = NotFound desc = could not find container \"85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3\": container with ID starting with 85870eb8d16e5a0aa653954423d87c8d90196f01bcd75fec76d849a2cb23dee3 not found: ID does not exist" Apr 17 18:06:29.628504 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.628480 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg"] Apr 17 18:06:29.639736 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:29.639696 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-922cg"] Apr 17 18:06:30.552921 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:30.552886 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerStarted","Data":"308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a"} Apr 17 18:06:30.553382 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:30.552928 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerStarted","Data":"6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b"} Apr 17 18:06:30.553382 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:30.553195 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:30.553382 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:30.553271 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:06:30.584126 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:30.584083 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podStartSLOduration=6.584070395 podStartE2EDuration="6.584070395s" 
podCreationTimestamp="2026-04-17 18:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:06:30.582945236 +0000 UTC m=+2502.253024876" watchObservedRunningTime="2026-04-17 18:06:30.584070395 +0000 UTC m=+2502.254150038" Apr 17 18:06:30.839057 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:30.839024 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7457f096-03cf-4eea-8155-125687ef0331" path="/var/lib/kubelet/pods/7457f096-03cf-4eea-8155-125687ef0331/volumes" Apr 17 18:06:36.562078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:06:36.562050 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:07:06.632005 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:06.631958 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 17 18:07:16.565132 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:16.565095 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:07:24.798799 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.798769 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv"] Apr 17 18:07:24.799384 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.799111 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kserve-container" containerID="cri-o://6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b" gracePeriod=30 Apr 17 18:07:24.799384 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.799157 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kube-rbac-proxy" containerID="cri-o://308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a" gracePeriod=30 Apr 17 18:07:24.858913 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.858882 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc"] Apr 17 18:07:24.859194 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859182 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" Apr 17 18:07:24.859246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859196 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" Apr 17 18:07:24.859246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859206 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kube-rbac-proxy" Apr 17 18:07:24.859246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859212 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kube-rbac-proxy" Apr 17 
18:07:24.859246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859232 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="storage-initializer" Apr 17 18:07:24.859246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859238 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="storage-initializer" Apr 17 18:07:24.859414 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859294 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kserve-container" Apr 17 18:07:24.859414 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.859303 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7457f096-03cf-4eea-8155-125687ef0331" containerName="kube-rbac-proxy" Apr 17 18:07:24.862484 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.862468 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:24.864621 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.864599 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 17 18:07:24.864736 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.864665 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 17 18:07:24.871360 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.871336 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc"] Apr 17 18:07:24.967127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.967090 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trrwn\" (UniqueName: \"kubernetes.io/projected/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kube-api-access-trrwn\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:24.967319 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.967150 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:24.967319 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.967179 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3c74fda3-85d9-4d9e-818e-310ce4e4b414-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:24.967319 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:24.967207 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.068213 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.068181 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.068452 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.068227 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3c74fda3-85d9-4d9e-818e-310ce4e4b414-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.068452 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:07:25.068381 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-serving-cert: secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 17 18:07:25.068452 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.068430 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.068452 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:07:25.068452 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls podName:3c74fda3-85d9-4d9e-818e-310ce4e4b414 nodeName:}" failed. No retries permitted until 2026-04-17 18:07:25.568430196 +0000 UTC m=+2557.238509834 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls") pod "isvc-sklearn-runtime-predictor-65cd49579f-98lwc" (UID: "3c74fda3-85d9-4d9e-818e-310ce4e4b414") : secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 17 18:07:25.068669 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.068538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trrwn\" (UniqueName: \"kubernetes.io/projected/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kube-api-access-trrwn\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.068731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.068713 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.068912 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.068894 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3c74fda3-85d9-4d9e-818e-310ce4e4b414-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.078691 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.078667 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trrwn\" (UniqueName: \"kubernetes.io/projected/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kube-api-access-trrwn\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.572138 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.572106 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.574469 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.574443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-98lwc\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.717658 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.717626 2566 generic.go:358] "Generic (PLEG): container finished" podID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerID="308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a" exitCode=2 Apr 17 18:07:25.717809 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.717692 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerDied","Data":"308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a"} Apr 17 18:07:25.773099 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.773050 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:25.901596 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:25.901550 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc"] Apr 17 18:07:25.903369 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:07:25.903343 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c74fda3_85d9_4d9e_818e_310ce4e4b414.slice/crio-a765fcc1f27dc39d7ed1f336f66b4e3d100872c1144611a94fd15bc0b2047833 WatchSource:0}: Error finding container a765fcc1f27dc39d7ed1f336f66b4e3d100872c1144611a94fd15bc0b2047833: Status 404 returned error can't find the container with id a765fcc1f27dc39d7ed1f336f66b4e3d100872c1144611a94fd15bc0b2047833 Apr 17 18:07:26.557618 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:26.557576 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 17 18:07:26.721993 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:26.721954 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerStarted","Data":"8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299"} Apr 17 18:07:26.721993 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:26.721992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerStarted","Data":"a765fcc1f27dc39d7ed1f336f66b4e3d100872c1144611a94fd15bc0b2047833"} Apr 17 18:07:27.603439 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:27.603395 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.44:8080/v2/models/sklearn-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 17 18:07:31.557452 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:31.557353 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 17 18:07:31.739236 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:31.739201 2566 generic.go:358] "Generic (PLEG): container finished" podID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerID="8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299" exitCode=0 Apr 17 18:07:31.739437 
ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:31.739277 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerDied","Data":"8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299"} Apr 17 18:07:32.338193 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.338171 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:07:32.429787 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.429689 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kserve-provision-location\") pod \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " Apr 17 18:07:32.429787 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.429738 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9sws\" (UniqueName: \"kubernetes.io/projected/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kube-api-access-f9sws\") pod \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " Apr 17 18:07:32.429787 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.429768 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fbc5012-95be-4ad0-8ea5-6612d33e801c-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " Apr 17 18:07:32.430060 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.429798 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fbc5012-95be-4ad0-8ea5-6612d33e801c-proxy-tls\") pod \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\" (UID: \"3fbc5012-95be-4ad0-8ea5-6612d33e801c\") " Apr 17 18:07:32.430060 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.429964 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3fbc5012-95be-4ad0-8ea5-6612d33e801c" (UID: "3fbc5012-95be-4ad0-8ea5-6612d33e801c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:07:32.430132 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.430105 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:07:32.430224 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.430155 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbc5012-95be-4ad0-8ea5-6612d33e801c-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "3fbc5012-95be-4ad0-8ea5-6612d33e801c" (UID: "3fbc5012-95be-4ad0-8ea5-6612d33e801c"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:07:32.431781 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.431749 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kube-api-access-f9sws" (OuterVolumeSpecName: "kube-api-access-f9sws") pod "3fbc5012-95be-4ad0-8ea5-6612d33e801c" (UID: "3fbc5012-95be-4ad0-8ea5-6612d33e801c"). InnerVolumeSpecName "kube-api-access-f9sws". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:07:32.431901 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.431807 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbc5012-95be-4ad0-8ea5-6612d33e801c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3fbc5012-95be-4ad0-8ea5-6612d33e801c" (UID: "3fbc5012-95be-4ad0-8ea5-6612d33e801c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:07:32.530669 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.530631 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f9sws\" (UniqueName: \"kubernetes.io/projected/3fbc5012-95be-4ad0-8ea5-6612d33e801c-kube-api-access-f9sws\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:07:32.530669 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.530664 2566 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fbc5012-95be-4ad0-8ea5-6612d33e801c-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:07:32.530867 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.530679 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fbc5012-95be-4ad0-8ea5-6612d33e801c-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:07:32.744553 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.744466 2566 generic.go:358] "Generic (PLEG): container finished" podID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerID="6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b" exitCode=0 Apr 17 18:07:32.745000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.744553 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" Apr 17 18:07:32.745000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.744545 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerDied","Data":"6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b"} Apr 17 18:07:32.745000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.744668 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv" event={"ID":"3fbc5012-95be-4ad0-8ea5-6612d33e801c","Type":"ContainerDied","Data":"6a322b2cda88d150b660f3ec12eed3320cbbf014e9353050f3e262d352cb3ce8"} Apr 17 18:07:32.745000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.744684 2566 scope.go:117] "RemoveContainer" containerID="308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a" Apr 17 18:07:32.746615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.746590 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerStarted","Data":"581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205"} Apr 17 18:07:32.746765 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.746630 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerStarted","Data":"8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce"} Apr 17 18:07:32.746970 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.746942 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:32.747075 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.746995 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:32.748484 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.748453 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 17 18:07:32.755139 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.755122 2566 scope.go:117] "RemoveContainer" containerID="6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b" Apr 17 18:07:32.763444 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.763427 2566 scope.go:117] "RemoveContainer" containerID="e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e" Apr 17 18:07:32.767514 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.767470 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" podStartSLOduration=8.767455072 podStartE2EDuration="8.767455072s" podCreationTimestamp="2026-04-17 18:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:07:32.76592833 +0000 UTC m=+2564.436007973" watchObservedRunningTime="2026-04-17 
18:07:32.767455072 +0000 UTC m=+2564.437534715" Apr 17 18:07:32.772792 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.772772 2566 scope.go:117] "RemoveContainer" containerID="308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a" Apr 17 18:07:32.773224 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:07:32.773205 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a\": container with ID starting with 308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a not found: ID does not exist" containerID="308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a" Apr 17 18:07:32.773395 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.773234 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a"} err="failed to get container status \"308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a\": rpc error: code = NotFound desc = could not find container \"308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a\": container with ID starting with 308f1949f0828cad474e1281caf26d9e89f757a320ff6ee55901bfe4f604e78a not found: ID does not exist" Apr 17 18:07:32.773395 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.773270 2566 scope.go:117] "RemoveContainer" containerID="6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b" Apr 17 18:07:32.773543 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:07:32.773520 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b\": container with ID starting with 6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b not found: ID does not exist" containerID="6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b" Apr 17 18:07:32.773597 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.773550 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b"} err="failed to get container status \"6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b\": rpc error: code = NotFound desc = could not find container \"6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b\": container with ID starting with 6546b8ea5d76a293a99222c7978f9de655510d063751d254bb012daf2439978b not found: ID does not exist" Apr 17 18:07:32.773597 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.773565 2566 scope.go:117] "RemoveContainer" containerID="e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e" Apr 17 18:07:32.773782 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:07:32.773767 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e\": container with ID starting with e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e not found: ID does not exist" containerID="e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e" Apr 17 18:07:32.773836 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.773784 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e"} err="failed to get container status \"e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e\": rpc error: code = NotFound desc = could not find container \"e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e\": container with ID starting with e36f7f19e148f1bb7df1d0194525469c330674f8ecb1a6dab9cc40b44be75e8e not found: ID does not exist" Apr 17 18:07:32.779775 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.779752 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv"] Apr 17 18:07:32.783740 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.783721 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t5ztv"] Apr 17 18:07:32.839658 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:32.839624 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" path="/var/lib/kubelet/pods/3fbc5012-95be-4ad0-8ea5-6612d33e801c/volumes" Apr 17 18:07:33.750465 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:33.750418 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 17 18:07:38.755656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:38.755625 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:07:38.756291 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:38.756244 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 17 18:07:48.757082 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:07:48.757049 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:08:01.861707 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:01.861630 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-65cd49579f-98lwc_3c74fda3-85d9-4d9e-818e-310ce4e4b414/kserve-container/0.log" Apr 17 18:08:01.994382 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:01.994345 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc"] Apr 17 18:08:01.994709 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:01.994680 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" containerID="cri-o://8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce" gracePeriod=30 Apr 17 18:08:01.994838 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:01.994714 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" 
containerName="kube-rbac-proxy" containerID="cri-o://581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205" gracePeriod=30 Apr 17 18:08:02.071488 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071454 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m"] Apr 17 18:08:02.071770 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071758 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kserve-container" Apr 17 18:08:02.071830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071771 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kserve-container" Apr 17 18:08:02.071830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071784 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="storage-initializer" Apr 17 18:08:02.071830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071789 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="storage-initializer" Apr 17 18:08:02.071830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071809 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kube-rbac-proxy" Apr 17 18:08:02.071830 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071815 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kube-rbac-proxy" Apr 17 18:08:02.071987 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071863 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kube-rbac-proxy" Apr 17 18:08:02.071987 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.071872 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fbc5012-95be-4ad0-8ea5-6612d33e801c" containerName="kserve-container" Apr 17 18:08:02.076127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.076111 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.078220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.078198 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 17 18:08:02.078417 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.078400 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 17 18:08:02.085750 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.085730 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m"] Apr 17 18:08:02.182380 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.182338 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed9c8ea-9005-49a5-807d-60798646a5f9-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.182380 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.182380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eed9c8ea-9005-49a5-807d-60798646a5f9-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.182583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.182493 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eed9c8ea-9005-49a5-807d-60798646a5f9-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.182583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.182525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lpnl\" (UniqueName: \"kubernetes.io/projected/eed9c8ea-9005-49a5-807d-60798646a5f9-kube-api-access-5lpnl\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.283912 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.283872 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed9c8ea-9005-49a5-807d-60798646a5f9-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.283912 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.283915 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eed9c8ea-9005-49a5-807d-60798646a5f9-proxy-tls\") 
pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.284172 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.283993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eed9c8ea-9005-49a5-807d-60798646a5f9-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.284172 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.284019 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lpnl\" (UniqueName: \"kubernetes.io/projected/eed9c8ea-9005-49a5-807d-60798646a5f9-kube-api-access-5lpnl\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.284329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.284308 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed9c8ea-9005-49a5-807d-60798646a5f9-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.284612 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.284592 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eed9c8ea-9005-49a5-807d-60798646a5f9-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.286359 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.286336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eed9c8ea-9005-49a5-807d-60798646a5f9-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.291911 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.291889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lpnl\" (UniqueName: \"kubernetes.io/projected/eed9c8ea-9005-49a5-807d-60798646a5f9-kube-api-access-5lpnl\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.387391 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.387294 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:02.511023 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.510994 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m"] Apr 17 18:08:02.513353 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:08:02.513319 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeed9c8ea_9005_49a5_807d_60798646a5f9.slice/crio-18bcea2ff984141bd0e6e6ff52a2a1e534c9dbb7c390f2c68da8eef1a92fb87e WatchSource:0}: Error finding container 18bcea2ff984141bd0e6e6ff52a2a1e534c9dbb7c390f2c68da8eef1a92fb87e: Status 404 returned error can't find the container with id 18bcea2ff984141bd0e6e6ff52a2a1e534c9dbb7c390f2c68da8eef1a92fb87e Apr 17 18:08:02.842138 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.842096 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerStarted","Data":"58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d"} Apr 17 18:08:02.842138 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.842138 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerStarted","Data":"18bcea2ff984141bd0e6e6ff52a2a1e534c9dbb7c390f2c68da8eef1a92fb87e"} Apr 17 18:08:02.844247 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.844221 2566 generic.go:358] "Generic (PLEG): container finished" podID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerID="581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205" exitCode=2 Apr 17 18:08:02.844405 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:02.844280 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerDied","Data":"581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205"} Apr 17 18:08:03.131576 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.131545 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:08:03.292551 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.292514 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kserve-provision-location\") pod \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " Apr 17 18:08:03.292551 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.292558 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3c74fda3-85d9-4d9e-818e-310ce4e4b414-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " Apr 17 18:08:03.292784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.292594 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls\") pod \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " Apr 17 18:08:03.292784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.292652 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trrwn\" (UniqueName: \"kubernetes.io/projected/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kube-api-access-trrwn\") pod \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\" (UID: \"3c74fda3-85d9-4d9e-818e-310ce4e4b414\") " Apr 17 18:08:03.292988 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.292958 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c74fda3-85d9-4d9e-818e-310ce4e4b414-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "3c74fda3-85d9-4d9e-818e-310ce4e4b414" (UID: "3c74fda3-85d9-4d9e-818e-310ce4e4b414"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:08:03.294762 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.294730 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3c74fda3-85d9-4d9e-818e-310ce4e4b414" (UID: "3c74fda3-85d9-4d9e-818e-310ce4e4b414"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:08:03.294898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.294803 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kube-api-access-trrwn" (OuterVolumeSpecName: "kube-api-access-trrwn") pod "3c74fda3-85d9-4d9e-818e-310ce4e4b414" (UID: "3c74fda3-85d9-4d9e-818e-310ce4e4b414"). InnerVolumeSpecName "kube-api-access-trrwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:08:03.320113 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.320073 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c74fda3-85d9-4d9e-818e-310ce4e4b414" (UID: "3c74fda3-85d9-4d9e-818e-310ce4e4b414"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:08:03.394069 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.393996 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c74fda3-85d9-4d9e-818e-310ce4e4b414-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:08:03.394069 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.394023 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trrwn\" (UniqueName: \"kubernetes.io/projected/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kube-api-access-trrwn\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:08:03.394069 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.394033 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c74fda3-85d9-4d9e-818e-310ce4e4b414-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:08:03.394069 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.394043 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3c74fda3-85d9-4d9e-818e-310ce4e4b414-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:08:03.849076 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.849039 2566 generic.go:358] "Generic (PLEG): container finished" podID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerID="8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce" exitCode=0 Apr 17 18:08:03.849220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.849120 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" Apr 17 18:08:03.849220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.849122 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerDied","Data":"8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce"} Apr 17 18:08:03.849220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.849157 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc" event={"ID":"3c74fda3-85d9-4d9e-818e-310ce4e4b414","Type":"ContainerDied","Data":"a765fcc1f27dc39d7ed1f336f66b4e3d100872c1144611a94fd15bc0b2047833"} Apr 17 18:08:03.849220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.849172 2566 scope.go:117] "RemoveContainer" containerID="581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205" Apr 17 18:08:03.857024 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.857009 2566 scope.go:117] "RemoveContainer" containerID="8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce" Apr 17 18:08:03.863938 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.863919 2566 scope.go:117] "RemoveContainer" containerID="8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299" Apr 17 18:08:03.869433 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.869409 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc"] Apr 17 18:08:03.871934 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.871911 2566 scope.go:117] "RemoveContainer" containerID="581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205" Apr 17 18:08:03.872197 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:08:03.872178 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205\": container with ID starting with 581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205 not found: ID does not exist" containerID="581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205" Apr 17 18:08:03.872293 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.872205 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205"} err="failed to get container status \"581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205\": rpc error: code = NotFound desc = could not find container \"581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205\": container with ID starting with 581765daa190a487e44a7fc1a0343b6ca9adb269a38840a093c1e221d27e8205 not found: ID does not exist" Apr 17 18:08:03.872293 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.872223 2566 scope.go:117] "RemoveContainer" containerID="8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce" Apr 17 18:08:03.872479 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.872457 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-98lwc"] Apr 17 18:08:03.872612 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:08:03.872464 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce\": container with ID starting with 8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce not found: ID does not exist" containerID="8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce" Apr 17 18:08:03.872612 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.872521 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce"} err="failed to get container status \"8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce\": rpc error: code = NotFound desc = could not find container \"8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce\": container with ID starting with 8306063cf6cfbc482f9b111c59a986a3e08a3fc7862da18475581c514fc240ce not found: ID does not exist" Apr 17 18:08:03.872612 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.872553 2566 scope.go:117] "RemoveContainer" containerID="8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299" Apr 17 18:08:03.872800 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:08:03.872776 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299\": container with ID starting with 8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299 not found: ID does not exist" containerID="8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299" Apr 17 18:08:03.872852 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:03.872812 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299"} err="failed to get container status \"8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299\": rpc error: code = NotFound desc = could not find container \"8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299\": container with ID starting with 8463d22958380314b20cda6677cbd4eda41ec534efca10c1aab17f8632b0b299 not found: ID does not exist" Apr 17 18:08:04.840508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:04.840473 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" path="/var/lib/kubelet/pods/3c74fda3-85d9-4d9e-818e-310ce4e4b414/volumes" Apr 17 18:08:06.858134 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:06.858100 2566 generic.go:358] "Generic (PLEG): container finished" podID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerID="58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d" exitCode=0 Apr 17 18:08:06.858633 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:06.858183 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerDied","Data":"58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d"} Apr 17 18:08:07.863108 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:07.863072 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerStarted","Data":"06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481"} Apr 17 18:08:07.863108 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:07.863112 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerStarted","Data":"66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f"} Apr 17 18:08:07.863616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:07.863324 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:07.863616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:07.863349 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:07.882508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:07.882466 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podStartSLOduration=5.882452495 podStartE2EDuration="5.882452495s" podCreationTimestamp="2026-04-17 18:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:08:07.880551927 +0000 UTC m=+2599.550631569" watchObservedRunningTime="2026-04-17 18:08:07.882452495 +0000 UTC m=+2599.552532137" Apr 17 18:08:13.871975 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:13.871946 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:08:43.932295 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:43.932234 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 17 18:08:53.875662 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:08:53.875625 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:09:02.196270 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.196223 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m"] Apr 17 18:09:02.196711 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.196579 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kserve-container" containerID="cri-o://66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f" gracePeriod=30 Apr 17 18:09:02.196711 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.196604 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kube-rbac-proxy" containerID="cri-o://06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481" gracePeriod=30 Apr 17 18:09:02.254628 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.254593 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll"] Apr 17 18:09:02.254952 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.254940 2566 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kube-rbac-proxy" Apr 17 18:09:02.254996 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.254954 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kube-rbac-proxy" Apr 17 18:09:02.254996 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.254970 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" Apr 17 18:09:02.254996 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.254978 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" Apr 17 18:09:02.254996 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.254993 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="storage-initializer" Apr 17 18:09:02.255125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.255002 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="storage-initializer" Apr 17 18:09:02.255125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.255059 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kube-rbac-proxy" Apr 17 18:09:02.255125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.255066 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c74fda3-85d9-4d9e-818e-310ce4e4b414" containerName="kserve-container" Apr 17 18:09:02.259430 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.259411 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.261399 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.261374 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 17 18:09:02.261598 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.261585 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 17 18:09:02.268417 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.268393 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll"] Apr 17 18:09:02.379440 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.379402 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae301454-239b-4f1f-9057-c2b8ba5396d6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.379616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.379465 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae301454-239b-4f1f-9057-c2b8ba5396d6-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.379616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.379492 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae301454-239b-4f1f-9057-c2b8ba5396d6-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.379616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.379526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/ae301454-239b-4f1f-9057-c2b8ba5396d6-kube-api-access-x7bvt\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.480948 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.480803 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/ae301454-239b-4f1f-9057-c2b8ba5396d6-kube-api-access-x7bvt\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.481104 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.480981 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae301454-239b-4f1f-9057-c2b8ba5396d6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: 
\"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.481104 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.481024 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae301454-239b-4f1f-9057-c2b8ba5396d6-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.481104 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.481047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae301454-239b-4f1f-9057-c2b8ba5396d6-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.481415 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.481400 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae301454-239b-4f1f-9057-c2b8ba5396d6-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.481753 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.481735 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae301454-239b-4f1f-9057-c2b8ba5396d6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.483474 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.483454 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae301454-239b-4f1f-9057-c2b8ba5396d6-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.489747 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.489723 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/ae301454-239b-4f1f-9057-c2b8ba5396d6-kube-api-access-x7bvt\") pod \"isvc-sklearn-v2-predictor-69755fbb9-c8mll\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.569619 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.569574 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:02.689035 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:02.689010 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll"] Apr 17 18:09:02.691633 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:09:02.691603 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae301454_239b_4f1f_9057_c2b8ba5396d6.slice/crio-56b72af2d690dabdfa9d311f11b22002f241fe46d9c2318f61a77fed49e4528a WatchSource:0}: Error finding container 56b72af2d690dabdfa9d311f11b22002f241fe46d9c2318f61a77fed49e4528a: Status 404 returned error can't find the container with id 56b72af2d690dabdfa9d311f11b22002f241fe46d9c2318f61a77fed49e4528a Apr 17 18:09:03.021528 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:03.021435 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerStarted","Data":"9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262"} Apr 17 18:09:03.021528 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:03.021474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerStarted","Data":"56b72af2d690dabdfa9d311f11b22002f241fe46d9c2318f61a77fed49e4528a"} Apr 17 18:09:03.023412 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:03.023388 2566 generic.go:358] "Generic (PLEG): container finished" podID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerID="06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481" exitCode=2 Apr 17 18:09:03.023550 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:03.023456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerDied","Data":"06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481"} Apr 17 18:09:03.867153 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:03.867104 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 17 18:09:04.914469 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:04.914425 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.46:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 17 18:09:07.036276 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:07.036221 2566 generic.go:358] "Generic (PLEG): container finished" podID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerID="9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262" exitCode=0 Apr 17 18:09:07.036719 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:07.036291 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerDied","Data":"9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262"} Apr 17 18:09:08.041592 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:08.041559 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerStarted","Data":"d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c"} Apr 17 18:09:08.041945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:08.041597 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerStarted","Data":"099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50"} Apr 17 18:09:08.041945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:08.041799 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:08.061101 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:08.061054 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podStartSLOduration=6.061039425 podStartE2EDuration="6.061039425s" podCreationTimestamp="2026-04-17 18:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:09:08.059167576 +0000 UTC m=+2659.729247218" watchObservedRunningTime="2026-04-17 18:09:08.061039425 +0000 UTC m=+2659.731119068" Apr 17 18:09:08.866623 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:08.866581 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 17 18:09:09.044583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.044542 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:09.045780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.045751 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:09:09.535372 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.535349 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:09:09.641731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.641646 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eed9c8ea-9005-49a5-807d-60798646a5f9-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"eed9c8ea-9005-49a5-807d-60798646a5f9\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " Apr 17 18:09:09.641731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.641693 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lpnl\" (UniqueName: \"kubernetes.io/projected/eed9c8ea-9005-49a5-807d-60798646a5f9-kube-api-access-5lpnl\") pod \"eed9c8ea-9005-49a5-807d-60798646a5f9\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " Apr 17 18:09:09.641927 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.641795 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed9c8ea-9005-49a5-807d-60798646a5f9-kserve-provision-location\") pod \"eed9c8ea-9005-49a5-807d-60798646a5f9\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " Apr 17 18:09:09.641927 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.641813 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eed9c8ea-9005-49a5-807d-60798646a5f9-proxy-tls\") pod \"eed9c8ea-9005-49a5-807d-60798646a5f9\" (UID: \"eed9c8ea-9005-49a5-807d-60798646a5f9\") " Apr 17 18:09:09.642127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.642098 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed9c8ea-9005-49a5-807d-60798646a5f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eed9c8ea-9005-49a5-807d-60798646a5f9" (UID: "eed9c8ea-9005-49a5-807d-60798646a5f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:09:09.642176 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.642098 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eed9c8ea-9005-49a5-807d-60798646a5f9-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "eed9c8ea-9005-49a5-807d-60798646a5f9" (UID: "eed9c8ea-9005-49a5-807d-60798646a5f9"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:09:09.643779 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.643757 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed9c8ea-9005-49a5-807d-60798646a5f9-kube-api-access-5lpnl" (OuterVolumeSpecName: "kube-api-access-5lpnl") pod "eed9c8ea-9005-49a5-807d-60798646a5f9" (UID: "eed9c8ea-9005-49a5-807d-60798646a5f9"). InnerVolumeSpecName "kube-api-access-5lpnl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:09:09.643855 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.643797 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9c8ea-9005-49a5-807d-60798646a5f9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eed9c8ea-9005-49a5-807d-60798646a5f9" (UID: "eed9c8ea-9005-49a5-807d-60798646a5f9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:09:09.742808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.742770 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed9c8ea-9005-49a5-807d-60798646a5f9-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:09:09.742808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.742812 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eed9c8ea-9005-49a5-807d-60798646a5f9-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:09:09.743011 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.742830 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eed9c8ea-9005-49a5-807d-60798646a5f9-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:09:09.743011 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:09.742841 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lpnl\" (UniqueName: \"kubernetes.io/projected/eed9c8ea-9005-49a5-807d-60798646a5f9-kube-api-access-5lpnl\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:09:10.054471 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.054385 2566 generic.go:358] "Generic (PLEG): container finished" podID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerID="66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f" exitCode=0 Apr 17 18:09:10.054902 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.054479 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" Apr 17 18:09:10.054902 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.054470 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerDied","Data":"66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f"} Apr 17 18:09:10.054902 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.054592 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m" event={"ID":"eed9c8ea-9005-49a5-807d-60798646a5f9","Type":"ContainerDied","Data":"18bcea2ff984141bd0e6e6ff52a2a1e534c9dbb7c390f2c68da8eef1a92fb87e"} Apr 17 18:09:10.054902 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.054616 2566 scope.go:117] "RemoveContainer" containerID="06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481" Apr 17 18:09:10.055131 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.055057 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:09:10.062865 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.062845 2566 scope.go:117] "RemoveContainer" containerID="66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f" Apr 17 18:09:10.069867 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.069849 2566 scope.go:117] "RemoveContainer" containerID="58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d" Apr 17 18:09:10.074981 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.074956 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m"] Apr 17 18:09:10.077491 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.077334 2566 scope.go:117] "RemoveContainer" containerID="06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481" Apr 17 18:09:10.077628 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:09:10.077604 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481\": container with ID starting with 06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481 not found: ID does not exist" containerID="06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481" Apr 17 18:09:10.077683 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.077636 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481"} err="failed to get container status \"06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481\": rpc error: code = NotFound desc = could not find container \"06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481\": container with ID starting with 06c8741aadb5b8bd93c1d7785c81d51409e363e65d1aacfa367d32299000a481 not found: ID does not exist" Apr 17 18:09:10.077683 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.077657 2566 scope.go:117] "RemoveContainer" containerID="66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f" Apr 17 18:09:10.077915 ip-10-0-140-147 
kubenswrapper[2566]: E0417 18:09:10.077897 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f\": container with ID starting with 66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f not found: ID does not exist" containerID="66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f" Apr 17 18:09:10.077985 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.077927 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f"} err="failed to get container status \"66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f\": rpc error: code = NotFound desc = could not find container \"66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f\": container with ID starting with 66dc41ca89fa53eba5c5613062671e676b6cd42a9c3dca4d8d9f826420b9f91f not found: ID does not exist" Apr 17 18:09:10.077985 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.077950 2566 scope.go:117] "RemoveContainer" containerID="58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d" Apr 17 18:09:10.078237 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:09:10.078221 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d\": container with ID starting with 58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d not found: ID does not exist" containerID="58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d" Apr 17 18:09:10.078324 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.078245 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d"} err="failed to get container status \"58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d\": rpc error: code = NotFound desc = could not find container \"58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d\": container with ID starting with 58aa786699c77c84964800a9184ee6b53c693f31a33137e96f71099369b2075d not found: ID does not exist" Apr 17 18:09:10.078897 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.078878 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-xff9m"] Apr 17 18:09:10.838335 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:10.838295 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" path="/var/lib/kubelet/pods/eed9c8ea-9005-49a5-807d-60798646a5f9/volumes" Apr 17 18:09:15.059611 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:15.059585 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:09:15.060129 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:15.060102 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:09:25.060109 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:25.060059 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:09:35.061015 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:35.060972 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:09:45.060657 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:45.060614 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:09:48.949266 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:48.949228 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:09:48.951350 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:48.951329 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:09:55.061044 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:09:55.061001 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:10:05.060981 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:05.060939 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:10:15.061542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:15.061513 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:10:22.455088 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.455055 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll"] Apr 17 18:10:22.455587 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.455392 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" containerID="cri-o://099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50" gracePeriod=30 Apr 17 18:10:22.455587 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.455514 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kube-rbac-proxy" containerID="cri-o://d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c" gracePeriod=30 Apr 17 18:10:22.527352 ip-10-0-140-147 kubenswrapper[2566]: I0417 
18:10:22.527318 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l"] Apr 17 18:10:22.527684 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527669 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="storage-initializer" Apr 17 18:10:22.527732 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527686 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="storage-initializer" Apr 17 18:10:22.527732 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527694 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kserve-container" Apr 17 18:10:22.527732 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527700 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kserve-container" Apr 17 18:10:22.527732 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527727 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kube-rbac-proxy" Apr 17 18:10:22.527732 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527733 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kube-rbac-proxy" Apr 17 18:10:22.527888 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527786 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kserve-container" Apr 17 18:10:22.527888 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.527795 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="eed9c8ea-9005-49a5-807d-60798646a5f9" containerName="kube-rbac-proxy" Apr 17 18:10:22.531121 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.531105 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.533365 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.533338 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 17 18:10:22.533484 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.533403 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 17 18:10:22.537884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.537860 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.538010 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.537900 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn9zx\" (UniqueName: \"kubernetes.io/projected/59217f5c-bd33-4db2-9740-417234884083-kube-api-access-cn9zx\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.538010 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.537955 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59217f5c-bd33-4db2-9740-417234884083-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.538132 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.538017 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59217f5c-bd33-4db2-9740-417234884083-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.540693 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.540672 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l"] Apr 17 18:10:22.638379 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.638348 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn9zx\" (UniqueName: \"kubernetes.io/projected/59217f5c-bd33-4db2-9740-417234884083-kube-api-access-cn9zx\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.638544 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.638395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/59217f5c-bd33-4db2-9740-417234884083-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.638544 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.638437 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59217f5c-bd33-4db2-9740-417234884083-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.638544 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.638469 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.638692 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:10:22.638574 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 17 18:10:22.638692 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:10:22.638630 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls podName:59217f5c-bd33-4db2-9740-417234884083 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:23.138611202 +0000 UTC m=+2734.808690822 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" (UID: "59217f5c-bd33-4db2-9740-417234884083") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 17 18:10:22.638836 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.638818 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59217f5c-bd33-4db2-9740-417234884083-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.639101 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.639079 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59217f5c-bd33-4db2-9740-417234884083-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:22.646267 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:22.646230 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn9zx\" (UniqueName: \"kubernetes.io/projected/59217f5c-bd33-4db2-9740-417234884083-kube-api-access-cn9zx\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:23.142783 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.142747 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:23.145090 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.145070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:23.276662 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.276632 2566 generic.go:358] "Generic (PLEG): container finished" podID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerID="d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c" exitCode=2 Apr 17 18:10:23.276827 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.276714 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerDied","Data":"d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c"} Apr 17 18:10:23.441933 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.441833 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:23.561949 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.561927 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l"] Apr 17 18:10:23.564094 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:10:23.564066 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59217f5c_bd33_4db2_9740_417234884083.slice/crio-c554061111c683c3196dfe8d8848783f567a24d96aca59d9f741c85292afce14 WatchSource:0}: Error finding container c554061111c683c3196dfe8d8848783f567a24d96aca59d9f741c85292afce14: Status 404 returned error can't find the container with id c554061111c683c3196dfe8d8848783f567a24d96aca59d9f741c85292afce14 Apr 17 18:10:23.565914 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:23.565896 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:10:24.282000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:24.281960 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerStarted","Data":"8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe"} Apr 17 18:10:24.282179 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:24.282005 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerStarted","Data":"c554061111c683c3196dfe8d8848783f567a24d96aca59d9f741c85292afce14"} Apr 17 18:10:25.055736 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:25.055691 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 17 18:10:25.060036 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:25.060004 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 17 18:10:26.791651 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.791623 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:10:26.872738 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.872704 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae301454-239b-4f1f-9057-c2b8ba5396d6-kserve-provision-location\") pod \"ae301454-239b-4f1f-9057-c2b8ba5396d6\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " Apr 17 18:10:26.872920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.872745 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae301454-239b-4f1f-9057-c2b8ba5396d6-proxy-tls\") pod \"ae301454-239b-4f1f-9057-c2b8ba5396d6\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " Apr 17 18:10:26.872920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.872762 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/ae301454-239b-4f1f-9057-c2b8ba5396d6-kube-api-access-x7bvt\") pod \"ae301454-239b-4f1f-9057-c2b8ba5396d6\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " Apr 17 18:10:26.872920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.872810 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae301454-239b-4f1f-9057-c2b8ba5396d6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"ae301454-239b-4f1f-9057-c2b8ba5396d6\" (UID: \"ae301454-239b-4f1f-9057-c2b8ba5396d6\") " Apr 17 18:10:26.873096 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.873008 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae301454-239b-4f1f-9057-c2b8ba5396d6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ae301454-239b-4f1f-9057-c2b8ba5396d6" (UID: "ae301454-239b-4f1f-9057-c2b8ba5396d6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:10:26.873221 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.873190 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae301454-239b-4f1f-9057-c2b8ba5396d6-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "ae301454-239b-4f1f-9057-c2b8ba5396d6" (UID: "ae301454-239b-4f1f-9057-c2b8ba5396d6"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:10:26.874808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.874786 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae301454-239b-4f1f-9057-c2b8ba5396d6-kube-api-access-x7bvt" (OuterVolumeSpecName: "kube-api-access-x7bvt") pod "ae301454-239b-4f1f-9057-c2b8ba5396d6" (UID: "ae301454-239b-4f1f-9057-c2b8ba5396d6"). InnerVolumeSpecName "kube-api-access-x7bvt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:10:26.874887 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.874805 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae301454-239b-4f1f-9057-c2b8ba5396d6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ae301454-239b-4f1f-9057-c2b8ba5396d6" (UID: "ae301454-239b-4f1f-9057-c2b8ba5396d6"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:10:26.973971 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.973895 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae301454-239b-4f1f-9057-c2b8ba5396d6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:10:26.973971 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.973921 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae301454-239b-4f1f-9057-c2b8ba5396d6-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:10:26.973971 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.973934 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae301454-239b-4f1f-9057-c2b8ba5396d6-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:10:26.973971 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:26.973944 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/ae301454-239b-4f1f-9057-c2b8ba5396d6-kube-api-access-x7bvt\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:10:27.292412 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.292320 2566 generic.go:358] "Generic (PLEG): container finished" podID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerID="099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50" exitCode=0 Apr 17 18:10:27.292565 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.292399 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerDied","Data":"099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50"} Apr 17 18:10:27.292565 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.292439 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" event={"ID":"ae301454-239b-4f1f-9057-c2b8ba5396d6","Type":"ContainerDied","Data":"56b72af2d690dabdfa9d311f11b22002f241fe46d9c2318f61a77fed49e4528a"} Apr 17 18:10:27.292565 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.292441 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll" Apr 17 18:10:27.292565 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.292452 2566 scope.go:117] "RemoveContainer" containerID="d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c" Apr 17 18:10:27.300596 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.300576 2566 scope.go:117] "RemoveContainer" containerID="099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50" Apr 17 18:10:27.307744 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.307725 2566 scope.go:117] "RemoveContainer" containerID="9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262" Apr 17 18:10:27.313055 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.313032 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll"] Apr 17 18:10:27.314782 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.314766 2566 scope.go:117] "RemoveContainer" containerID="d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c" Apr 17 18:10:27.315041 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:10:27.315023 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c\": container with ID starting with d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c not found: ID does not exist" containerID="d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c" Apr 17 18:10:27.315098 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.315049 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c"} err="failed to get container status \"d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c\": rpc error: code = NotFound desc = could not find container \"d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c\": container with ID starting with d7c83a586d94af05ed266a4652f0ba162928cc2afc38df0e632c9642baf01e1c not found: ID does not exist" Apr 17 18:10:27.315098 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.315068 2566 scope.go:117] "RemoveContainer" containerID="099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50" Apr 17 18:10:27.315341 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:10:27.315313 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50\": container with ID starting with 099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50 not found: ID does not exist" containerID="099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50" Apr 17 18:10:27.315435 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.315361 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50"} err="failed to get container status \"099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50\": rpc error: code = NotFound desc = could not find container \"099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50\": container with ID starting with 099874d0d7554690afe5d015ad933f83553cc8329f441f96f6adf5736c93cb50 not found: ID does not exist" Apr 17 18:10:27.315435 ip-10-0-140-147 kubenswrapper[2566]: I0417 
18:10:27.315378 2566 scope.go:117] "RemoveContainer" containerID="9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262" Apr 17 18:10:27.315613 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:10:27.315599 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262\": container with ID starting with 9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262 not found: ID does not exist" containerID="9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262" Apr 17 18:10:27.315664 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.315617 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262"} err="failed to get container status \"9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262\": rpc error: code = NotFound desc = could not find container \"9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262\": container with ID starting with 9cf54d6e4afe2b030ea389c0b0888af1258cfc3bb704252023afd7552a9b2262 not found: ID does not exist" Apr 17 18:10:27.318957 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:27.318937 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-c8mll"] Apr 17 18:10:28.297024 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:28.296989 2566 generic.go:358] "Generic (PLEG): container finished" podID="59217f5c-bd33-4db2-9740-417234884083" containerID="8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe" exitCode=0 Apr 17 18:10:28.297520 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:28.297057 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerDied","Data":"8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe"} Apr 17 18:10:28.838676 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:28.838642 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" path="/var/lib/kubelet/pods/ae301454-239b-4f1f-9057-c2b8ba5396d6/volumes" Apr 17 18:10:29.301235 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:29.301150 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerStarted","Data":"25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3"} Apr 17 18:10:29.301235 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:29.301192 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerStarted","Data":"81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d"} Apr 17 18:10:29.301711 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:29.301506 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:29.301711 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:29.301632 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:29.302885 
ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:29.302855 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:10:29.319198 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:29.319157 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podStartSLOduration=7.319144931 podStartE2EDuration="7.319144931s" podCreationTimestamp="2026-04-17 18:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:10:29.318343325 +0000 UTC m=+2740.988422966" watchObservedRunningTime="2026-04-17 18:10:29.319144931 +0000 UTC m=+2740.989224573" Apr 17 18:10:30.304534 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:30.304496 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:10:35.309199 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:35.309166 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:10:35.309689 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:35.309663 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:10:45.309982 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:45.309942 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:10:55.309817 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:10:55.309777 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:11:05.309646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:05.309564 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:11:15.310118 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:15.310079 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:11:25.310232 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:25.310185 
2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:11:35.310411 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:35.310378 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:11:42.660919 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.660887 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l"] Apr 17 18:11:42.661402 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.661318 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" containerID="cri-o://81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d" gracePeriod=30 Apr 17 18:11:42.661402 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.661333 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kube-rbac-proxy" containerID="cri-o://25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3" gracePeriod=30 Apr 17 18:11:42.726723 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.726693 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq"] Apr 17 18:11:42.726999 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.726988 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="storage-initializer" Apr 17 18:11:42.727045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727002 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="storage-initializer" Apr 17 18:11:42.727045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727017 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" Apr 17 18:11:42.727045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727023 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" Apr 17 18:11:42.727045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727038 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kube-rbac-proxy" Apr 17 18:11:42.727045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727045 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kube-rbac-proxy" Apr 17 18:11:42.727225 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727093 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" containerName="kserve-container" Apr 17 18:11:42.727225 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.727101 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae301454-239b-4f1f-9057-c2b8ba5396d6" 
containerName="kube-rbac-proxy" Apr 17 18:11:42.730445 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.730429 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.732652 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.732614 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 17 18:11:42.732775 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.732709 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 17 18:11:42.743315 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.743286 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq"] Apr 17 18:11:42.817598 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.817566 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/638ce086-8f01-4848-a5b1-a6894b720bf6-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.817739 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.817613 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/638ce086-8f01-4848-a5b1-a6894b720bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.817739 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.817699 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xv5c\" (UniqueName: \"kubernetes.io/projected/638ce086-8f01-4848-a5b1-a6894b720bf6-kube-api-access-5xv5c\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.817739 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.817731 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/638ce086-8f01-4848-a5b1-a6894b720bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.919114 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.919023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/638ce086-8f01-4848-a5b1-a6894b720bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.919114 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.919084 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xv5c\" (UniqueName: 
\"kubernetes.io/projected/638ce086-8f01-4848-a5b1-a6894b720bf6-kube-api-access-5xv5c\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.919114 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.919104 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/638ce086-8f01-4848-a5b1-a6894b720bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.919438 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.919140 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/638ce086-8f01-4848-a5b1-a6894b720bf6-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.919627 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.919608 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/638ce086-8f01-4848-a5b1-a6894b720bf6-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.919766 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.919749 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/638ce086-8f01-4848-a5b1-a6894b720bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.921493 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.921476 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/638ce086-8f01-4848-a5b1-a6894b720bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:42.927017 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:42.926986 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xv5c\" (UniqueName: \"kubernetes.io/projected/638ce086-8f01-4848-a5b1-a6894b720bf6-kube-api-access-5xv5c\") pod \"isvc-tensorflow-predictor-6756f669d7-xd2bq\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:43.040581 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:43.040541 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:43.157220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:43.157194 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq"] Apr 17 18:11:43.159710 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:11:43.159680 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod638ce086_8f01_4848_a5b1_a6894b720bf6.slice/crio-395332da7f355abf0adf4bf4e2e510c82386b7c18edcc46075f45fa6c13f3cfa WatchSource:0}: Error finding container 395332da7f355abf0adf4bf4e2e510c82386b7c18edcc46075f45fa6c13f3cfa: Status 404 returned error can't find the container with id 395332da7f355abf0adf4bf4e2e510c82386b7c18edcc46075f45fa6c13f3cfa Apr 17 18:11:43.520067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:43.519970 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerStarted","Data":"ab611e94c94e6bdea42317c71d555926a38dcfd2728d4a1d1eddf7e4d7ded5c3"} Apr 17 18:11:43.520067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:43.520019 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerStarted","Data":"395332da7f355abf0adf4bf4e2e510c82386b7c18edcc46075f45fa6c13f3cfa"} Apr 17 18:11:43.521907 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:43.521881 2566 generic.go:358] "Generic (PLEG): container finished" podID="59217f5c-bd33-4db2-9740-417234884083" containerID="25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3" exitCode=2 Apr 17 18:11:43.522021 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:43.521947 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerDied","Data":"25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3"} Apr 17 18:11:45.305661 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:45.305619 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused" Apr 17 18:11:45.309997 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:45.309966 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 17 18:11:47.100405 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.100384 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:11:47.257091 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.257000 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls\") pod \"59217f5c-bd33-4db2-9740-417234884083\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " Apr 17 18:11:47.257091 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.257038 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59217f5c-bd33-4db2-9740-417234884083-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"59217f5c-bd33-4db2-9740-417234884083\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " Apr 17 18:11:47.257091 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.257091 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59217f5c-bd33-4db2-9740-417234884083-kserve-provision-location\") pod \"59217f5c-bd33-4db2-9740-417234884083\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " Apr 17 18:11:47.257408 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.257145 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn9zx\" (UniqueName: \"kubernetes.io/projected/59217f5c-bd33-4db2-9740-417234884083-kube-api-access-cn9zx\") pod \"59217f5c-bd33-4db2-9740-417234884083\" (UID: \"59217f5c-bd33-4db2-9740-417234884083\") " Apr 17 18:11:47.257533 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.257505 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59217f5c-bd33-4db2-9740-417234884083-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59217f5c-bd33-4db2-9740-417234884083" (UID: "59217f5c-bd33-4db2-9740-417234884083"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:11:47.257594 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.257557 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59217f5c-bd33-4db2-9740-417234884083-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "59217f5c-bd33-4db2-9740-417234884083" (UID: "59217f5c-bd33-4db2-9740-417234884083"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:11:47.259144 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.259116 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59217f5c-bd33-4db2-9740-417234884083" (UID: "59217f5c-bd33-4db2-9740-417234884083"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:11:47.259266 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.259197 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59217f5c-bd33-4db2-9740-417234884083-kube-api-access-cn9zx" (OuterVolumeSpecName: "kube-api-access-cn9zx") pod "59217f5c-bd33-4db2-9740-417234884083" (UID: "59217f5c-bd33-4db2-9740-417234884083"). InnerVolumeSpecName "kube-api-access-cn9zx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:11:47.357963 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.357916 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59217f5c-bd33-4db2-9740-417234884083-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:11:47.357963 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.357950 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59217f5c-bd33-4db2-9740-417234884083-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:11:47.357963 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.357961 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59217f5c-bd33-4db2-9740-417234884083-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:11:47.357963 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.357971 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cn9zx\" (UniqueName: \"kubernetes.io/projected/59217f5c-bd33-4db2-9740-417234884083-kube-api-access-cn9zx\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:11:47.537444 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.537363 2566 generic.go:358] "Generic (PLEG): container finished" podID="59217f5c-bd33-4db2-9740-417234884083" containerID="81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d" exitCode=0 Apr 17 18:11:47.537444 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.537423 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerDied","Data":"81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d"} Apr 17 18:11:47.537444 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.537439 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" Apr 17 18:11:47.537694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.537450 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l" event={"ID":"59217f5c-bd33-4db2-9740-417234884083","Type":"ContainerDied","Data":"c554061111c683c3196dfe8d8848783f567a24d96aca59d9f741c85292afce14"} Apr 17 18:11:47.537694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.537465 2566 scope.go:117] "RemoveContainer" containerID="25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3" Apr 17 18:11:47.545845 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.545819 2566 scope.go:117] "RemoveContainer" containerID="81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d" Apr 17 18:11:47.552562 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.552542 2566 scope.go:117] "RemoveContainer" containerID="8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe" Apr 17 18:11:47.557867 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.557844 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l"] Apr 17 18:11:47.560986 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.560966 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-lxk4l"] Apr 17 18:11:47.561793 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.561772 2566 scope.go:117] "RemoveContainer" containerID="25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3" Apr 17 18:11:47.562181 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:11:47.562163 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3\": container with ID starting with 25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3 not found: ID does not exist" containerID="25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3" Apr 17 18:11:47.562281 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.562194 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3"} err="failed to get container status \"25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3\": rpc error: code = NotFound desc = could not find container \"25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3\": container with ID starting with 25e3883b89078cba4e84bc07f0a3994498a391ec15680baafc8f51f269c005d3 not found: ID does not exist" Apr 17 18:11:47.562281 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.562219 2566 scope.go:117] "RemoveContainer" containerID="81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d" Apr 17 18:11:47.562511 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:11:47.562495 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d\": container with ID starting with 81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d not found: ID does not exist" containerID="81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d" Apr 17 18:11:47.562556 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.562518 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d"} err="failed to get container status \"81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d\": rpc error: code = NotFound desc = could not find container \"81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d\": container with ID starting with 81890acb6c61dbaf1f67c56febde6ee4ed5a2aa9620187f4aa122cd32a583a4d not found: ID does not exist" Apr 17 18:11:47.562556 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.562534 2566 scope.go:117] "RemoveContainer" containerID="8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe" Apr 17 18:11:47.562769 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:11:47.562747 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe\": container with ID starting with 8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe not found: ID does not exist" containerID="8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe" Apr 17 18:11:47.562815 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:47.562780 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe"} err="failed to get container status \"8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe\": rpc error: code = NotFound desc = could not find container \"8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe\": container with ID starting with 8c7c815c1f12b79873eb2101cb93811d4a5536a214ba744cf06e6d654faa5efe not found: ID does not exist" Apr 17 18:11:48.839402 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:48.839370 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59217f5c-bd33-4db2-9740-417234884083" path="/var/lib/kubelet/pods/59217f5c-bd33-4db2-9740-417234884083/volumes" Apr 17 18:11:50.547731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:50.547640 2566 generic.go:358] "Generic (PLEG): container finished" podID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerID="ab611e94c94e6bdea42317c71d555926a38dcfd2728d4a1d1eddf7e4d7ded5c3" exitCode=0 Apr 17 18:11:50.547731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:50.547715 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerDied","Data":"ab611e94c94e6bdea42317c71d555926a38dcfd2728d4a1d1eddf7e4d7ded5c3"} Apr 17 18:11:55.566727 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:55.566690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerStarted","Data":"70000f7efa7ad4841f12b4a520f68e64c7f67b27e52607e4a944fe763d43fc03"} Apr 17 18:11:55.567094 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:55.566736 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerStarted","Data":"3880e7ee3cb5d56c971b871e0dcda8fc6131434ccca3e9c24dd7307bd3f18652"} Apr 17 18:11:55.567094 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:55.566936 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:55.586917 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:55.586870 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podStartSLOduration=9.646067179 podStartE2EDuration="13.586857634s" podCreationTimestamp="2026-04-17 18:11:42 +0000 UTC" firstStartedPulling="2026-04-17 18:11:50.548887933 +0000 UTC m=+2822.218967552" lastFinishedPulling="2026-04-17 18:11:54.489678384 +0000 UTC m=+2826.159758007" observedRunningTime="2026-04-17 18:11:55.585139982 +0000 UTC m=+2827.255219624" watchObservedRunningTime="2026-04-17 18:11:55.586857634 +0000 UTC m=+2827.256937276" Apr 17 18:11:56.570299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:56.570265 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:11:56.571624 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:56.571595 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 17 18:11:57.573346 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:11:57.573300 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 17 18:12:02.578164 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:02.578139 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:12:02.578630 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:02.578604 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 17 18:12:12.579720 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:12.579686 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:12:33.380391 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.380356 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq"] Apr 17 18:12:33.380983 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.380786 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" containerID="cri-o://70000f7efa7ad4841f12b4a520f68e64c7f67b27e52607e4a944fe763d43fc03" gracePeriod=30 Apr 17 18:12:33.380983 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.380835 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" 
containerID="cri-o://3880e7ee3cb5d56c971b871e0dcda8fc6131434ccca3e9c24dd7307bd3f18652" gracePeriod=30 Apr 17 18:12:33.448975 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.448939 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm"] Apr 17 18:12:33.449246 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449235 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="storage-initializer" Apr 17 18:12:33.449329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449247 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="storage-initializer" Apr 17 18:12:33.449329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449276 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" Apr 17 18:12:33.449329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449282 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" Apr 17 18:12:33.449329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449295 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kube-rbac-proxy" Apr 17 18:12:33.449329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449300 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kube-rbac-proxy" Apr 17 18:12:33.449528 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449342 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kserve-container" Apr 17 18:12:33.449528 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.449351 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="59217f5c-bd33-4db2-9740-417234884083" containerName="kube-rbac-proxy" Apr 17 18:12:33.451999 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.451981 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.455162 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.455145 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 17 18:12:33.455272 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.455150 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 17 18:12:33.461815 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.461795 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm"] Apr 17 18:12:33.553316 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.553285 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/283b3043-05e4-492f-a287-eafa84443d02-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.553491 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.553338 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/283b3043-05e4-492f-a287-eafa84443d02-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.553491 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.553368 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcf2c\" (UniqueName: \"kubernetes.io/projected/283b3043-05e4-492f-a287-eafa84443d02-kube-api-access-fcf2c\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.553491 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.553453 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.654808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.654732 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/283b3043-05e4-492f-a287-eafa84443d02-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.654808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.654771 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/283b3043-05e4-492f-a287-eafa84443d02-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.654808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.654803 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcf2c\" (UniqueName: \"kubernetes.io/projected/283b3043-05e4-492f-a287-eafa84443d02-kube-api-access-fcf2c\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.655096 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.654872 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.655096 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:12:33.654976 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 17 18:12:33.655096 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:12:33.655048 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls podName:283b3043-05e4-492f-a287-eafa84443d02 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:34.155030983 +0000 UTC m=+2865.825110603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" (UID: "283b3043-05e4-492f-a287-eafa84443d02") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 17 18:12:33.655291 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.655193 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/283b3043-05e4-492f-a287-eafa84443d02-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.655543 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.655517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/283b3043-05e4-492f-a287-eafa84443d02-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.663078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.663055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcf2c\" (UniqueName: \"kubernetes.io/projected/283b3043-05e4-492f-a287-eafa84443d02-kube-api-access-fcf2c\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:33.676787 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.676766 2566 generic.go:358] "Generic (PLEG): container finished" podID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerID="3880e7ee3cb5d56c971b871e0dcda8fc6131434ccca3e9c24dd7307bd3f18652" exitCode=2 Apr 17 18:12:33.676884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:33.676829 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerDied","Data":"3880e7ee3cb5d56c971b871e0dcda8fc6131434ccca3e9c24dd7307bd3f18652"} Apr 17 18:12:34.159568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:34.159526 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:34.161870 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:34.161841 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c64zm\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:34.361919 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:34.361889 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:34.482111 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:34.481958 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm"] Apr 17 18:12:34.484516 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:12:34.484488 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod283b3043_05e4_492f_a287_eafa84443d02.slice/crio-5ad4bf9ff7793d8fda8c81ee46d1919fa84d34fdc9bec47c7f289d6d95d5623b WatchSource:0}: Error finding container 5ad4bf9ff7793d8fda8c81ee46d1919fa84d34fdc9bec47c7f289d6d95d5623b: Status 404 returned error can't find the container with id 5ad4bf9ff7793d8fda8c81ee46d1919fa84d34fdc9bec47c7f289d6d95d5623b Apr 17 18:12:34.681915 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:34.681876 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerStarted","Data":"31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c"} Apr 17 18:12:34.681915 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:34.681920 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerStarted","Data":"5ad4bf9ff7793d8fda8c81ee46d1919fa84d34fdc9bec47c7f289d6d95d5623b"} Apr 17 18:12:37.573967 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:37.573917 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 17 18:12:39.699267 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:39.699221 2566 generic.go:358] "Generic (PLEG): container finished" podID="283b3043-05e4-492f-a287-eafa84443d02" containerID="31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c" exitCode=0 Apr 17 18:12:39.699690 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:39.699307 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerDied","Data":"31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c"} Apr 17 18:12:40.704207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:40.704180 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerStarted","Data":"6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed"} Apr 17 18:12:40.704207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:40.704211 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerStarted","Data":"cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88"} Apr 17 18:12:40.704675 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:40.704434 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:40.724549 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:40.724505 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podStartSLOduration=7.724490714 podStartE2EDuration="7.724490714s" podCreationTimestamp="2026-04-17 18:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:40.723730839 +0000 UTC m=+2872.393810505" watchObservedRunningTime="2026-04-17 18:12:40.724490714 +0000 UTC m=+2872.394570357" Apr 17 18:12:41.707707 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:41.707672 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:41.708920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:41.708888 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 17 18:12:42.573892 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:42.573844 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 17 18:12:42.711342 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:42.711299 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 17 18:12:47.573768 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:47.573718 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 17 18:12:47.574161 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:47.573869 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:12:47.716718 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:47.716694 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:12:47.717322 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:47.717297 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 17 18:12:52.574488 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:52.574442 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 17 18:12:57.574234 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:57.574192 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 17 18:12:57.717660 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:12:57.717629 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:13:02.573699 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:02.573657 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 17 18:13:03.772410 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:03.772323 2566 generic.go:358] "Generic (PLEG): container finished" podID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerID="70000f7efa7ad4841f12b4a520f68e64c7f67b27e52607e4a944fe763d43fc03" exitCode=137 Apr 17 18:13:03.772745 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:03.772400 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerDied","Data":"70000f7efa7ad4841f12b4a520f68e64c7f67b27e52607e4a944fe763d43fc03"} Apr 17 18:13:04.012762 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.012738 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:13:04.107438 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.107410 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/638ce086-8f01-4848-a5b1-a6894b720bf6-kserve-provision-location\") pod \"638ce086-8f01-4848-a5b1-a6894b720bf6\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " Apr 17 18:13:04.107608 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.107469 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xv5c\" (UniqueName: \"kubernetes.io/projected/638ce086-8f01-4848-a5b1-a6894b720bf6-kube-api-access-5xv5c\") pod \"638ce086-8f01-4848-a5b1-a6894b720bf6\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " Apr 17 18:13:04.107608 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.107490 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/638ce086-8f01-4848-a5b1-a6894b720bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"638ce086-8f01-4848-a5b1-a6894b720bf6\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " Apr 17 18:13:04.107608 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.107514 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/638ce086-8f01-4848-a5b1-a6894b720bf6-proxy-tls\") pod \"638ce086-8f01-4848-a5b1-a6894b720bf6\" (UID: \"638ce086-8f01-4848-a5b1-a6894b720bf6\") " Apr 17 18:13:04.107914 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.107890 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/638ce086-8f01-4848-a5b1-a6894b720bf6-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "638ce086-8f01-4848-a5b1-a6894b720bf6" (UID: "638ce086-8f01-4848-a5b1-a6894b720bf6"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:04.109578 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.109553 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/638ce086-8f01-4848-a5b1-a6894b720bf6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "638ce086-8f01-4848-a5b1-a6894b720bf6" (UID: "638ce086-8f01-4848-a5b1-a6894b720bf6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:13:04.109578 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.109571 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638ce086-8f01-4848-a5b1-a6894b720bf6-kube-api-access-5xv5c" (OuterVolumeSpecName: "kube-api-access-5xv5c") pod "638ce086-8f01-4848-a5b1-a6894b720bf6" (UID: "638ce086-8f01-4848-a5b1-a6894b720bf6"). InnerVolumeSpecName "kube-api-access-5xv5c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:13:04.118107 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.118083 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/638ce086-8f01-4848-a5b1-a6894b720bf6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "638ce086-8f01-4848-a5b1-a6894b720bf6" (UID: "638ce086-8f01-4848-a5b1-a6894b720bf6"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:13:04.208267 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.208230 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/638ce086-8f01-4848-a5b1-a6894b720bf6-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:04.208412 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.208282 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xv5c\" (UniqueName: \"kubernetes.io/projected/638ce086-8f01-4848-a5b1-a6894b720bf6-kube-api-access-5xv5c\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:04.208412 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.208298 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/638ce086-8f01-4848-a5b1-a6894b720bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:04.208412 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.208311 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/638ce086-8f01-4848-a5b1-a6894b720bf6-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:04.776753 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.776722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" event={"ID":"638ce086-8f01-4848-a5b1-a6894b720bf6","Type":"ContainerDied","Data":"395332da7f355abf0adf4bf4e2e510c82386b7c18edcc46075f45fa6c13f3cfa"} Apr 17 18:13:04.777188 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.776767 2566 scope.go:117] "RemoveContainer" containerID="3880e7ee3cb5d56c971b871e0dcda8fc6131434ccca3e9c24dd7307bd3f18652" Apr 17 18:13:04.777188 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.776790 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq" Apr 17 18:13:04.784683 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.784599 2566 scope.go:117] "RemoveContainer" containerID="70000f7efa7ad4841f12b4a520f68e64c7f67b27e52607e4a944fe763d43fc03" Apr 17 18:13:04.791685 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.791665 2566 scope.go:117] "RemoveContainer" containerID="ab611e94c94e6bdea42317c71d555926a38dcfd2728d4a1d1eddf7e4d7ded5c3" Apr 17 18:13:04.797492 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.797472 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq"] Apr 17 18:13:04.801153 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.801135 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xd2bq"] Apr 17 18:13:04.838617 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:04.838592 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" path="/var/lib/kubelet/pods/638ce086-8f01-4848-a5b1-a6894b720bf6/volumes" Apr 17 18:13:14.085898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.085865 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm"] Apr 17 18:13:14.086480 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.086297 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" containerID="cri-o://cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88" gracePeriod=30 Apr 17 18:13:14.086480 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.086311 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" containerID="cri-o://6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed" gracePeriod=30 Apr 17 18:13:14.164148 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164109 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf"] Apr 17 18:13:14.164525 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164508 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" Apr 17 18:13:14.164615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164527 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" Apr 17 18:13:14.164615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164541 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" Apr 17 18:13:14.164615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164550 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" Apr 17 18:13:14.164615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164562 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="storage-initializer" Apr 
17 18:13:14.164615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164571 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="storage-initializer" Apr 17 18:13:14.164885 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164665 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kube-rbac-proxy" Apr 17 18:13:14.164885 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.164679 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="638ce086-8f01-4848-a5b1-a6894b720bf6" containerName="kserve-container" Apr 17 18:13:14.169195 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.169175 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.171401 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.171382 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 17 18:13:14.171511 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.171401 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 17 18:13:14.176324 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.176299 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf"] Apr 17 18:13:14.190158 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.190137 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e643438f-3067-4d57-8f54-f37a5fe3cf10-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.190291 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.190187 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.190291 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.190224 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e643438f-3067-4d57-8f54-f37a5fe3cf10-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.190412 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.190313 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcsdw\" (UniqueName: \"kubernetes.io/projected/e643438f-3067-4d57-8f54-f37a5fe3cf10-kube-api-access-qcsdw\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.291338 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.291302 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.291338 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.291343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e643438f-3067-4d57-8f54-f37a5fe3cf10-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.291587 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.291382 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcsdw\" (UniqueName: \"kubernetes.io/projected/e643438f-3067-4d57-8f54-f37a5fe3cf10-kube-api-access-qcsdw\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.291587 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.291422 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e643438f-3067-4d57-8f54-f37a5fe3cf10-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.291587 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:13:14.291464 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 17 18:13:14.291587 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:13:14.291556 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls podName:e643438f-3067-4d57-8f54-f37a5fe3cf10 nodeName:}" failed. No retries permitted until 2026-04-17 18:13:14.791526242 +0000 UTC m=+2906.461605862 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-lv2cf" (UID: "e643438f-3067-4d57-8f54-f37a5fe3cf10") : secret "isvc-triton-predictor-serving-cert" not found Apr 17 18:13:14.291803 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.291761 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e643438f-3067-4d57-8f54-f37a5fe3cf10-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.292020 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.292003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e643438f-3067-4d57-8f54-f37a5fe3cf10-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.299574 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.299550 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcsdw\" (UniqueName: \"kubernetes.io/projected/e643438f-3067-4d57-8f54-f37a5fe3cf10-kube-api-access-qcsdw\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.795329 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.795289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.797563 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.797540 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-lv2cf\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:14.807613 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.807588 2566 generic.go:358] "Generic (PLEG): container finished" podID="283b3043-05e4-492f-a287-eafa84443d02" containerID="6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed" exitCode=2 Apr 17 18:13:14.807723 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:14.807660 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerDied","Data":"6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed"} Apr 17 18:13:15.080228 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:15.080204 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:13:15.195698 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:15.195627 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf"] Apr 17 18:13:15.198620 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:13:15.198590 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode643438f_3067_4d57_8f54_f37a5fe3cf10.slice/crio-6430f6a1f7b09854115b1276c227e2e0f66fd66049486559513c027c1b77707d WatchSource:0}: Error finding container 6430f6a1f7b09854115b1276c227e2e0f66fd66049486559513c027c1b77707d: Status 404 returned error can't find the container with id 6430f6a1f7b09854115b1276c227e2e0f66fd66049486559513c027c1b77707d Apr 17 18:13:15.812169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:15.812135 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerStarted","Data":"3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478"} Apr 17 18:13:15.812169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:15.812170 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerStarted","Data":"6430f6a1f7b09854115b1276c227e2e0f66fd66049486559513c027c1b77707d"} Apr 17 18:13:17.711689 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:17.711646 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 17 18:13:19.824687 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:19.824656 2566 generic.go:358] "Generic (PLEG): container finished" podID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerID="3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478" exitCode=0 Apr 17 18:13:19.825073 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:19.824697 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerDied","Data":"3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478"} Apr 17 18:13:22.712660 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:22.712162 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 17 18:13:27.711660 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:27.711603 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 17 18:13:27.712125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:27.711771 2566 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:13:32.712130 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:32.712088 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 17 18:13:37.712052 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:37.711997 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 17 18:13:42.712362 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:42.712321 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 17 18:13:44.772611 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.772584 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:13:44.875532 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.875437 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/283b3043-05e4-492f-a287-eafa84443d02-kserve-provision-location\") pod \"283b3043-05e4-492f-a287-eafa84443d02\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " Apr 17 18:13:44.875532 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.875509 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls\") pod \"283b3043-05e4-492f-a287-eafa84443d02\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " Apr 17 18:13:44.875767 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.875561 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcf2c\" (UniqueName: \"kubernetes.io/projected/283b3043-05e4-492f-a287-eafa84443d02-kube-api-access-fcf2c\") pod \"283b3043-05e4-492f-a287-eafa84443d02\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " Apr 17 18:13:44.875767 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.875648 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/283b3043-05e4-492f-a287-eafa84443d02-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"283b3043-05e4-492f-a287-eafa84443d02\" (UID: \"283b3043-05e4-492f-a287-eafa84443d02\") " Apr 17 18:13:44.876223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.876190 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283b3043-05e4-492f-a287-eafa84443d02-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "283b3043-05e4-492f-a287-eafa84443d02" (UID: "283b3043-05e4-492f-a287-eafa84443d02"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:44.880850 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.880819 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/283b3043-05e4-492f-a287-eafa84443d02-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "283b3043-05e4-492f-a287-eafa84443d02" (UID: "283b3043-05e4-492f-a287-eafa84443d02"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:13:44.882053 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.882013 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/283b3043-05e4-492f-a287-eafa84443d02-kube-api-access-fcf2c" (OuterVolumeSpecName: "kube-api-access-fcf2c") pod "283b3043-05e4-492f-a287-eafa84443d02" (UID: "283b3043-05e4-492f-a287-eafa84443d02"). InnerVolumeSpecName "kube-api-access-fcf2c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:13:44.882164 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.882091 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "283b3043-05e4-492f-a287-eafa84443d02" (UID: "283b3043-05e4-492f-a287-eafa84443d02"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:13:44.927622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.927321 2566 generic.go:358] "Generic (PLEG): container finished" podID="283b3043-05e4-492f-a287-eafa84443d02" containerID="cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88" exitCode=137 Apr 17 18:13:44.927622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.927367 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerDied","Data":"cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88"} Apr 17 18:13:44.927622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.927398 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" event={"ID":"283b3043-05e4-492f-a287-eafa84443d02","Type":"ContainerDied","Data":"5ad4bf9ff7793d8fda8c81ee46d1919fa84d34fdc9bec47c7f289d6d95d5623b"} Apr 17 18:13:44.927622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.927418 2566 scope.go:117] "RemoveContainer" containerID="6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed" Apr 17 18:13:44.927622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.927456 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm" Apr 17 18:13:44.940036 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.940015 2566 scope.go:117] "RemoveContainer" containerID="cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88" Apr 17 18:13:44.949270 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.949228 2566 scope.go:117] "RemoveContainer" containerID="31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c" Apr 17 18:13:44.955576 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.955554 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm"] Apr 17 18:13:44.959034 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.959015 2566 scope.go:117] "RemoveContainer" containerID="6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed" Apr 17 18:13:44.959993 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:13:44.959823 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed\": container with ID starting with 6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed not found: ID does not exist" containerID="6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed" Apr 17 18:13:44.959993 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.959875 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed"} err="failed to get container status \"6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed\": rpc error: code = NotFound desc = could not find container \"6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed\": container with ID starting with 6401eff97dd0a71f7c3f977eef6a96d12a5eee73f63977f74d99cf2c88c6f8ed not found: ID does not exist" Apr 17 18:13:44.959993 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.959900 2566 scope.go:117] "RemoveContainer" containerID="cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88" Apr 17 18:13:44.961344 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:13:44.960353 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88\": container with ID starting with cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88 not found: ID does not exist" containerID="cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88" Apr 17 18:13:44.961344 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.960391 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88"} err="failed to get container status \"cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88\": rpc error: code = NotFound desc = could not find container \"cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88\": container with ID starting with cbadc4e7c0330750f9107ae4abc2c45e25322e49900a61fbfc5b3dfc79306d88 not found: ID does not exist" Apr 17 18:13:44.961344 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.960413 2566 scope.go:117] "RemoveContainer" containerID="31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c" Apr 17 18:13:44.961344 ip-10-0-140-147 kubenswrapper[2566]: 
E0417 18:13:44.960733 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c\": container with ID starting with 31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c not found: ID does not exist" containerID="31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c" Apr 17 18:13:44.961344 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.960776 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c"} err="failed to get container status \"31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c\": rpc error: code = NotFound desc = could not find container \"31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c\": container with ID starting with 31e3b506b9dde9ef2581c15fb9ca7e200fbfd0faf9e8547262ea46adfa24ec0c not found: ID does not exist" Apr 17 18:13:44.963755 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.962501 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c64zm"] Apr 17 18:13:44.976302 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.976281 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/283b3043-05e4-492f-a287-eafa84443d02-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:44.976410 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.976309 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/283b3043-05e4-492f-a287-eafa84443d02-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:44.976410 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.976324 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcf2c\" (UniqueName: \"kubernetes.io/projected/283b3043-05e4-492f-a287-eafa84443d02-kube-api-access-fcf2c\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:44.976410 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:44.976338 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/283b3043-05e4-492f-a287-eafa84443d02-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:13:46.840944 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:13:46.840467 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="283b3043-05e4-492f-a287-eafa84443d02" path="/var/lib/kubelet/pods/283b3043-05e4-492f-a287-eafa84443d02/volumes" Apr 17 18:15:13.457638 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:13.457608 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:15:13.458162 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:13.457608 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:15:15.218245 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:15.218209 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerStarted","Data":"34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047"} Apr 17 18:15:15.218763 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:15.218273 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerStarted","Data":"0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9"} Apr 17 18:15:15.218763 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:15.218352 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:15:15.277012 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:15.276960 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" podStartSLOduration=6.746563579 podStartE2EDuration="2m1.276946166s" podCreationTimestamp="2026-04-17 18:13:14 +0000 UTC" firstStartedPulling="2026-04-17 18:13:19.825750382 +0000 UTC m=+2911.495830001" lastFinishedPulling="2026-04-17 18:15:14.356132964 +0000 UTC m=+3026.026212588" observedRunningTime="2026-04-17 18:15:15.276632476 +0000 UTC m=+3026.946712129" watchObservedRunningTime="2026-04-17 18:15:15.276946166 +0000 UTC m=+3026.947025808" Apr 17 18:15:16.220918 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:16.220890 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:15:16.221820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:16.221796 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 17 18:15:17.224050 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:17.224005 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 17 18:15:22.228418 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:22.228387 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:15:22.229178 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:22.229160 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:15:25.908344 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:25.908313 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf"] Apr 17 18:15:25.908883 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:25.908856 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kserve-container" containerID="cri-o://0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9" gracePeriod=30 Apr 17 18:15:25.908972 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:25.908881 2566 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kube-rbac-proxy" containerID="cri-o://34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047" gracePeriod=30 Apr 17 18:15:26.249933 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.249836 2566 generic.go:358] "Generic (PLEG): container finished" podID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerID="34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047" exitCode=2 Apr 17 18:15:26.249933 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.249879 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerDied","Data":"34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047"} Apr 17 18:15:26.254158 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254134 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p"] Apr 17 18:15:26.254458 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254446 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="storage-initializer" Apr 17 18:15:26.254458 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254460 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="storage-initializer" Apr 17 18:15:26.254542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254466 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" Apr 17 18:15:26.254542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254472 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" Apr 17 18:15:26.254542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254480 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" Apr 17 18:15:26.254542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254486 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" Apr 17 18:15:26.254542 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254535 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kube-rbac-proxy" Apr 17 18:15:26.254695 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.254545 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="283b3043-05e4-492f-a287-eafa84443d02" containerName="kserve-container" Apr 17 18:15:26.268924 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.268907 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.271733 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.271713 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 17 18:15:26.271998 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.271984 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 17 18:15:26.291228 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.291198 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p"] Apr 17 18:15:26.357763 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.357734 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6tr\" (UniqueName: \"kubernetes.io/projected/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kube-api-access-qc6tr\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.357934 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.357769 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.357934 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.357858 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e43c699-610e-4fe8-9f95-b0acb0fce3be-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.357934 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.357904 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e43c699-610e-4fe8-9f95-b0acb0fce3be-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.458639 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.458607 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e43c699-610e-4fe8-9f95-b0acb0fce3be-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.458884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.458656 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e43c699-610e-4fe8-9f95-b0acb0fce3be-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.458884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.458689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6tr\" (UniqueName: \"kubernetes.io/projected/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kube-api-access-qc6tr\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.458884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.458711 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.459125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.459107 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.459382 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.459363 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e43c699-610e-4fe8-9f95-b0acb0fce3be-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.461238 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.461218 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e43c699-610e-4fe8-9f95-b0acb0fce3be-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.483857 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.483828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc6tr\" (UniqueName: \"kubernetes.io/projected/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kube-api-access-qc6tr\") pod \"isvc-xgboost-predictor-8689c4cfcc-hcz8p\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.578181 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.578154 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:26.704898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.704845 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p"] Apr 17 18:15:26.710793 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:15:26.710767 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e43c699_610e_4fe8_9f95_b0acb0fce3be.slice/crio-098e8d7919c96518a2c357e1b33a4a27b91958e8dfc47f2a9999441daea181a8 WatchSource:0}: Error finding container 098e8d7919c96518a2c357e1b33a4a27b91958e8dfc47f2a9999441daea181a8: Status 404 returned error can't find the container with id 098e8d7919c96518a2c357e1b33a4a27b91958e8dfc47f2a9999441daea181a8 Apr 17 18:15:26.712651 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:26.712630 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:15:27.225270 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:27.225210 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.51:8643/healthz\": dial tcp 10.133.0.51:8643: connect: connection refused" Apr 17 18:15:27.253757 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:27.253725 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerStarted","Data":"f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8"} Apr 17 18:15:27.253757 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:27.253762 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerStarted","Data":"098e8d7919c96518a2c357e1b33a4a27b91958e8dfc47f2a9999441daea181a8"} Apr 17 18:15:28.154826 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.154802 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:15:28.170616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.170590 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e643438f-3067-4d57-8f54-f37a5fe3cf10-isvc-triton-kube-rbac-proxy-sar-config\") pod \"e643438f-3067-4d57-8f54-f37a5fe3cf10\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " Apr 17 18:15:28.170765 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.170637 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e643438f-3067-4d57-8f54-f37a5fe3cf10-kserve-provision-location\") pod \"e643438f-3067-4d57-8f54-f37a5fe3cf10\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " Apr 17 18:15:28.170765 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.170680 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcsdw\" (UniqueName: \"kubernetes.io/projected/e643438f-3067-4d57-8f54-f37a5fe3cf10-kube-api-access-qcsdw\") pod \"e643438f-3067-4d57-8f54-f37a5fe3cf10\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " Apr 17 18:15:28.170765 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.170713 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls\") pod \"e643438f-3067-4d57-8f54-f37a5fe3cf10\" (UID: \"e643438f-3067-4d57-8f54-f37a5fe3cf10\") " Apr 17 18:15:28.171049 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.171013 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e643438f-3067-4d57-8f54-f37a5fe3cf10-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "e643438f-3067-4d57-8f54-f37a5fe3cf10" (UID: "e643438f-3067-4d57-8f54-f37a5fe3cf10"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:15:28.171138 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.171082 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e643438f-3067-4d57-8f54-f37a5fe3cf10-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e643438f-3067-4d57-8f54-f37a5fe3cf10" (UID: "e643438f-3067-4d57-8f54-f37a5fe3cf10"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:15:28.172811 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.172788 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e643438f-3067-4d57-8f54-f37a5fe3cf10-kube-api-access-qcsdw" (OuterVolumeSpecName: "kube-api-access-qcsdw") pod "e643438f-3067-4d57-8f54-f37a5fe3cf10" (UID: "e643438f-3067-4d57-8f54-f37a5fe3cf10"). InnerVolumeSpecName "kube-api-access-qcsdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:15:28.173042 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.173022 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e643438f-3067-4d57-8f54-f37a5fe3cf10" (UID: "e643438f-3067-4d57-8f54-f37a5fe3cf10"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:15:28.263043 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.262939 2566 generic.go:358] "Generic (PLEG): container finished" podID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerID="0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9" exitCode=0 Apr 17 18:15:28.263508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.263040 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerDied","Data":"0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9"} Apr 17 18:15:28.263508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.263118 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" event={"ID":"e643438f-3067-4d57-8f54-f37a5fe3cf10","Type":"ContainerDied","Data":"6430f6a1f7b09854115b1276c227e2e0f66fd66049486559513c027c1b77707d"} Apr 17 18:15:28.263508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.263122 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf" Apr 17 18:15:28.263508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.263153 2566 scope.go:117] "RemoveContainer" containerID="34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047" Apr 17 18:15:28.271233 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.271213 2566 scope.go:117] "RemoveContainer" containerID="0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9" Apr 17 18:15:28.271441 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.271421 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcsdw\" (UniqueName: \"kubernetes.io/projected/e643438f-3067-4d57-8f54-f37a5fe3cf10-kube-api-access-qcsdw\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:15:28.271543 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.271443 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e643438f-3067-4d57-8f54-f37a5fe3cf10-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:15:28.271543 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.271453 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e643438f-3067-4d57-8f54-f37a5fe3cf10-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:15:28.271543 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.271462 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e643438f-3067-4d57-8f54-f37a5fe3cf10-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:15:28.278143 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.278129 2566 scope.go:117] "RemoveContainer" containerID="3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478" Apr 17 18:15:28.285100 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.285080 2566 scope.go:117] "RemoveContainer" containerID="34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047" Apr 17 18:15:28.285443 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:15:28.285415 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047\": container with ID starting with 34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047 not found: ID does not exist" containerID="34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047" Apr 17 18:15:28.285495 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.285453 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047"} err="failed to get container status \"34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047\": rpc error: code = NotFound desc = could not find container \"34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047\": container with ID starting with 34978278d21c65bdda36df84505d93b59bf0d4c91849c3d437e8703e33581047 not found: ID does not exist" Apr 17 18:15:28.285495 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.285471 2566 scope.go:117] "RemoveContainer" containerID="0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9" Apr 17 18:15:28.285717 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:15:28.285702 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9\": container with ID starting with 0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9 not found: ID does not exist" containerID="0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9" Apr 17 18:15:28.285768 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.285721 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9"} err="failed to get container status \"0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9\": rpc error: code = NotFound desc = could not find container \"0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9\": container with ID starting with 0275653249d6235cd748334a89a94b9ec3d6559add6f48d027ccad0ed769ccc9 not found: ID does not exist" Apr 17 18:15:28.285768 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.285735 2566 scope.go:117] "RemoveContainer" containerID="3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478" Apr 17 18:15:28.285974 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:15:28.285956 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478\": container with ID starting with 3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478 not found: ID does not exist" containerID="3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478" Apr 17 18:15:28.286018 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.285978 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478"} err="failed to get container status \"3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478\": rpc error: code = NotFound desc = could not find container \"3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478\": container with ID starting with 3e32fdb1d6650c957f8ca9101ed5eceff603af8861194516fa4e419e39092478 not found: ID does not exist" Apr 17 18:15:28.313320 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:15:28.313294 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf"] Apr 17 18:15:28.324868 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.324846 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-lv2cf"] Apr 17 18:15:28.838176 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:28.838145 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" path="/var/lib/kubelet/pods/e643438f-3067-4d57-8f54-f37a5fe3cf10/volumes" Apr 17 18:15:31.272969 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:31.272940 2566 generic.go:358] "Generic (PLEG): container finished" podID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerID="f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8" exitCode=0 Apr 17 18:15:31.273369 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:31.273021 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerDied","Data":"f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8"} Apr 17 18:15:53.343468 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:53.343429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerStarted","Data":"949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c"} Apr 17 18:15:53.343908 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:53.343480 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerStarted","Data":"b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8"} Apr 17 18:15:53.343908 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:53.343718 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:53.382371 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:53.382325 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podStartSLOduration=5.766601464 podStartE2EDuration="27.38231079s" podCreationTimestamp="2026-04-17 18:15:26 +0000 UTC" firstStartedPulling="2026-04-17 18:15:31.274167 +0000 UTC m=+3042.944246620" lastFinishedPulling="2026-04-17 18:15:52.889876325 +0000 UTC m=+3064.559955946" observedRunningTime="2026-04-17 18:15:53.381460206 +0000 UTC m=+3065.051539848" watchObservedRunningTime="2026-04-17 18:15:53.38231079 +0000 UTC m=+3065.052390431" Apr 17 18:15:54.346760 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:54.346727 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:15:54.347899 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:54.347865 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:15:55.349306 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:15:55.349272 2566 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:16:00.353942 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:00.353913 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:16:00.354460 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:00.354436 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:16:10.355170 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:10.355128 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:16:20.354476 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:20.354440 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:16:30.354474 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:30.354436 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:16:40.354697 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:40.354654 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:16:50.354895 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:16:50.354856 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 17 18:17:00.355346 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:00.355272 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:17:06.064203 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.064173 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p"] Apr 17 18:17:06.064625 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.064515 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" containerID="cri-o://b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8" gracePeriod=30 Apr 17 18:17:06.064705 
ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.064596 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kube-rbac-proxy" containerID="cri-o://949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c" gracePeriod=30 Apr 17 18:17:06.209843 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.209813 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw"] Apr 17 18:17:06.210223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210209 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kserve-container" Apr 17 18:17:06.210299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210224 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kserve-container" Apr 17 18:17:06.210299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210236 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kube-rbac-proxy" Apr 17 18:17:06.210299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210242 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kube-rbac-proxy" Apr 17 18:17:06.210299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210278 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="storage-initializer" Apr 17 18:17:06.210299 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210288 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="storage-initializer" Apr 17 18:17:06.210471 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210355 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kserve-container" Apr 17 18:17:06.210471 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.210367 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e643438f-3067-4d57-8f54-f37a5fe3cf10" containerName="kube-rbac-proxy" Apr 17 18:17:06.213243 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.213222 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.215695 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.215676 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 17 18:17:06.215828 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.215725 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 17 18:17:06.224595 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.224570 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw"] Apr 17 18:17:06.275911 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.275880 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzfg\" (UniqueName: \"kubernetes.io/projected/bb044221-071e-4902-8ec2-d25fb734753d-kube-api-access-jhzfg\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.276085 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.275946 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb044221-071e-4902-8ec2-d25fb734753d-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.276085 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.276006 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb044221-071e-4902-8ec2-d25fb734753d-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.276085 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.276067 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb044221-071e-4902-8ec2-d25fb734753d-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.377346 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.377268 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb044221-071e-4902-8ec2-d25fb734753d-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.377346 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.377330 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzfg\" (UniqueName: 
\"kubernetes.io/projected/bb044221-071e-4902-8ec2-d25fb734753d-kube-api-access-jhzfg\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.377559 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.377374 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb044221-071e-4902-8ec2-d25fb734753d-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.377559 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.377394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb044221-071e-4902-8ec2-d25fb734753d-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.377654 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.377633 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb044221-071e-4902-8ec2-d25fb734753d-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.378006 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.377983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb044221-071e-4902-8ec2-d25fb734753d-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.379765 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.379744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb044221-071e-4902-8ec2-d25fb734753d-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.386097 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.386075 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhzfg\" (UniqueName: \"kubernetes.io/projected/bb044221-071e-4902-8ec2-d25fb734753d-kube-api-access-jhzfg\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.523310 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.523274 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:06.557434 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.557404 2566 generic.go:358] "Generic (PLEG): container finished" podID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerID="949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c" exitCode=2 Apr 17 18:17:06.557578 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.557440 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerDied","Data":"949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c"} Apr 17 18:17:06.645381 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:06.645281 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw"] Apr 17 18:17:06.647984 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:17:06.647956 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb044221_071e_4902_8ec2_d25fb734753d.slice/crio-839eb7a5110a4e0f3657c99b283f4e795e6264004ac9ab50925e3e52fbd17ba1 WatchSource:0}: Error finding container 839eb7a5110a4e0f3657c99b283f4e795e6264004ac9ab50925e3e52fbd17ba1: Status 404 returned error can't find the container with id 839eb7a5110a4e0f3657c99b283f4e795e6264004ac9ab50925e3e52fbd17ba1 Apr 17 18:17:07.561792 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:07.561755 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerStarted","Data":"dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd"} Apr 17 18:17:07.561792 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:07.561793 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerStarted","Data":"839eb7a5110a4e0f3657c99b283f4e795e6264004ac9ab50925e3e52fbd17ba1"} Apr 17 18:17:09.503521 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.503499 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:17:09.569499 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.569471 2566 generic.go:358] "Generic (PLEG): container finished" podID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerID="b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8" exitCode=0 Apr 17 18:17:09.569652 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.569505 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerDied","Data":"b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8"} Apr 17 18:17:09.569652 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.569528 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" event={"ID":"5e43c699-610e-4fe8-9f95-b0acb0fce3be","Type":"ContainerDied","Data":"098e8d7919c96518a2c357e1b33a4a27b91958e8dfc47f2a9999441daea181a8"} Apr 17 18:17:09.569652 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.569547 2566 scope.go:117] "RemoveContainer" containerID="949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c" Apr 17 18:17:09.569652 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.569555 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p" Apr 17 18:17:09.577377 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.577358 2566 scope.go:117] "RemoveContainer" containerID="b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8" Apr 17 18:17:09.584102 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.584087 2566 scope.go:117] "RemoveContainer" containerID="f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8" Apr 17 18:17:09.590414 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.590400 2566 scope.go:117] "RemoveContainer" containerID="949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c" Apr 17 18:17:09.590635 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:17:09.590615 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c\": container with ID starting with 949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c not found: ID does not exist" containerID="949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c" Apr 17 18:17:09.590677 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.590641 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c"} err="failed to get container status \"949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c\": rpc error: code = NotFound desc = could not find container \"949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c\": container with ID starting with 949d1f45ff035ff65018e1119f5576223f5fc1cf497dbda26d6d685827914e8c not found: ID does not exist" Apr 17 18:17:09.590677 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.590656 2566 scope.go:117] "RemoveContainer" containerID="b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8" Apr 17 18:17:09.590899 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:17:09.590882 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8\": container with ID starting with b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8 not found: ID does not exist" containerID="b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8" Apr 17 18:17:09.590951 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.590907 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8"} err="failed to get container status \"b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8\": rpc error: code = NotFound desc = could not find container \"b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8\": container with ID starting with b300d56e9a653a69a78e63ede32880b1317780c97e1717a2008ab205c75205e8 not found: ID does not exist" Apr 17 18:17:09.590951 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.590922 2566 scope.go:117] "RemoveContainer" containerID="f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8" Apr 17 18:17:09.591124 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:17:09.591109 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8\": container with ID starting with f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8 not found: ID does not exist" containerID="f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8" Apr 17 18:17:09.591164 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.591128 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8"} err="failed to get container status \"f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8\": rpc error: code = NotFound desc = could not find container \"f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8\": container with ID starting with f5389c0894086777c6ccd9e7d33781c2178daec34fa046fada9642e1a425dfc8 not found: ID does not exist" Apr 17 18:17:09.603436 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603419 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e43c699-610e-4fe8-9f95-b0acb0fce3be-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " Apr 17 18:17:09.603507 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603451 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kserve-provision-location\") pod \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " Apr 17 18:17:09.603507 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603477 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e43c699-610e-4fe8-9f95-b0acb0fce3be-proxy-tls\") pod \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " Apr 17 18:17:09.603573 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603547 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-qc6tr\" (UniqueName: \"kubernetes.io/projected/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kube-api-access-qc6tr\") pod \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\" (UID: \"5e43c699-610e-4fe8-9f95-b0acb0fce3be\") " Apr 17 18:17:09.603709 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603690 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e43c699-610e-4fe8-9f95-b0acb0fce3be" (UID: "5e43c699-610e-4fe8-9f95-b0acb0fce3be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:17:09.603783 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603743 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e43c699-610e-4fe8-9f95-b0acb0fce3be-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "5e43c699-610e-4fe8-9f95-b0acb0fce3be" (UID: "5e43c699-610e-4fe8-9f95-b0acb0fce3be"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:17:09.603783 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.603750 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:17:09.605362 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.605338 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e43c699-610e-4fe8-9f95-b0acb0fce3be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5e43c699-610e-4fe8-9f95-b0acb0fce3be" (UID: "5e43c699-610e-4fe8-9f95-b0acb0fce3be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:17:09.605470 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.605448 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kube-api-access-qc6tr" (OuterVolumeSpecName: "kube-api-access-qc6tr") pod "5e43c699-610e-4fe8-9f95-b0acb0fce3be" (UID: "5e43c699-610e-4fe8-9f95-b0acb0fce3be"). InnerVolumeSpecName "kube-api-access-qc6tr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:17:09.705081 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.705054 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qc6tr\" (UniqueName: \"kubernetes.io/projected/5e43c699-610e-4fe8-9f95-b0acb0fce3be-kube-api-access-qc6tr\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:17:09.705081 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.705080 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e43c699-610e-4fe8-9f95-b0acb0fce3be-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:17:09.705242 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.705092 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e43c699-610e-4fe8-9f95-b0acb0fce3be-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:17:09.895795 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.895762 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p"] Apr 17 18:17:09.899740 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:09.899717 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hcz8p"] Apr 17 18:17:10.573801 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:10.573774 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb044221-071e-4902-8ec2-d25fb734753d" containerID="dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd" exitCode=0 Apr 17 18:17:10.574160 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:10.573848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerDied","Data":"dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd"} Apr 17 18:17:10.838818 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:10.838788 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" path="/var/lib/kubelet/pods/5e43c699-610e-4fe8-9f95-b0acb0fce3be/volumes" Apr 17 18:17:11.578994 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:11.578965 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerStarted","Data":"2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad"} Apr 17 18:17:11.578994 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:11.579000 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerStarted","Data":"b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08"} Apr 17 18:17:11.579416 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:11.579212 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:11.579416 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:11.579276 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:11.603820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:11.603771 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" podStartSLOduration=5.603755541 podStartE2EDuration="5.603755541s" podCreationTimestamp="2026-04-17 18:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:17:11.602663334 +0000 UTC m=+3143.272742989" watchObservedRunningTime="2026-04-17 18:17:11.603755541 +0000 UTC m=+3143.273835184" Apr 17 18:17:17.587561 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:17.587534 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:47.591631 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:47.591601 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:17:56.263393 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.263354 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw"] Apr 17 18:17:56.263826 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.263680 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kserve-container" containerID="cri-o://b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08" gracePeriod=30 Apr 17 18:17:56.263826 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.263736 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kube-rbac-proxy" containerID="cri-o://2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad" gracePeriod=30 Apr 17 18:17:56.338367 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338332 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg"] Apr 17 18:17:56.338669 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338658 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kube-rbac-proxy" Apr 17 18:17:56.338715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338671 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kube-rbac-proxy" Apr 17 18:17:56.338715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338684 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="storage-initializer" Apr 17 18:17:56.338715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338689 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="storage-initializer" Apr 17 18:17:56.338715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338695 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" 
containerName="kserve-container" Apr 17 18:17:56.338715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338701 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" Apr 17 18:17:56.338870 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338744 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kserve-container" Apr 17 18:17:56.338870 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.338753 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e43c699-610e-4fe8-9f95-b0acb0fce3be" containerName="kube-rbac-proxy" Apr 17 18:17:56.342920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.342905 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.345647 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.345625 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 17 18:17:56.345647 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.345646 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 17 18:17:56.351541 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.351519 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg"] Apr 17 18:17:56.477579 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.477544 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e237db1-4368-4167-aaf6-55756fa0fd69-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.477773 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.477595 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e237db1-4368-4167-aaf6-55756fa0fd69-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.477773 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.477680 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4jfj\" (UniqueName: \"kubernetes.io/projected/7e237db1-4368-4167-aaf6-55756fa0fd69-kube-api-access-f4jfj\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.477773 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.477725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e237db1-4368-4167-aaf6-55756fa0fd69-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.578510 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.578476 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e237db1-4368-4167-aaf6-55756fa0fd69-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.578689 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.578545 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e237db1-4368-4167-aaf6-55756fa0fd69-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.578689 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.578603 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e237db1-4368-4167-aaf6-55756fa0fd69-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.578689 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.578684 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4jfj\" (UniqueName: \"kubernetes.io/projected/7e237db1-4368-4167-aaf6-55756fa0fd69-kube-api-access-f4jfj\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.578960 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.578932 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e237db1-4368-4167-aaf6-55756fa0fd69-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.579211 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.579190 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e237db1-4368-4167-aaf6-55756fa0fd69-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.580951 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.580928 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e237db1-4368-4167-aaf6-55756fa0fd69-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.587222 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.587204 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f4jfj\" (UniqueName: \"kubernetes.io/projected/7e237db1-4368-4167-aaf6-55756fa0fd69-kube-api-access-f4jfj\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-p5lrg\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.654996 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.654969 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:17:56.709772 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.709735 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb044221-071e-4902-8ec2-d25fb734753d" containerID="2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad" exitCode=2 Apr 17 18:17:56.709931 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.709810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerDied","Data":"2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad"} Apr 17 18:17:56.777982 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:56.777953 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg"] Apr 17 18:17:56.781794 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:17:56.781769 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e237db1_4368_4167_aaf6_55756fa0fd69.slice/crio-0697e2d5c41022583d901f79456f3909904c27e142a83b508acf5e6d7ee628fd WatchSource:0}: Error finding container 0697e2d5c41022583d901f79456f3909904c27e142a83b508acf5e6d7ee628fd: Status 404 returned error can't find the container with id 0697e2d5c41022583d901f79456f3909904c27e142a83b508acf5e6d7ee628fd Apr 17 18:17:57.583068 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:57.583031 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 17 18:17:57.714163 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:57.714129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerStarted","Data":"377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f"} Apr 17 18:17:57.714163 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:57.714168 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerStarted","Data":"0697e2d5c41022583d901f79456f3909904c27e142a83b508acf5e6d7ee628fd"} Apr 17 18:17:58.629444 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:17:58.629400 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.53:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Apr 17 18:18:00.724155 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:00.724125 2566 generic.go:358] "Generic (PLEG): container finished" podID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerID="377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f" exitCode=0 Apr 17 18:18:00.724497 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:00.724181 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerDied","Data":"377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f"} Apr 17 18:18:01.729076 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:01.729042 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerStarted","Data":"4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b"} Apr 17 18:18:01.729526 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:01.729084 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerStarted","Data":"f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50"} Apr 17 18:18:01.729526 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:01.729297 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:18:01.750339 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:01.750295 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" podStartSLOduration=5.750282167 podStartE2EDuration="5.750282167s" podCreationTimestamp="2026-04-17 18:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:18:01.748772163 +0000 UTC m=+3193.418851818" watchObservedRunningTime="2026-04-17 18:18:01.750282167 +0000 UTC m=+3193.420361809" Apr 17 18:18:02.605780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.605759 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:18:02.734132 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734036 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb044221-071e-4902-8ec2-d25fb734753d" containerID="b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08" exitCode=0 Apr 17 18:18:02.734132 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734119 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" Apr 17 18:18:02.734132 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734121 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerDied","Data":"b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08"} Apr 17 18:18:02.734746 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734158 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" event={"ID":"bb044221-071e-4902-8ec2-d25fb734753d","Type":"ContainerDied","Data":"839eb7a5110a4e0f3657c99b283f4e795e6264004ac9ab50925e3e52fbd17ba1"} Apr 17 18:18:02.734746 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734173 2566 scope.go:117] "RemoveContainer" containerID="2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad" Apr 17 18:18:02.734746 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734537 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:18:02.734913 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734782 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb044221-071e-4902-8ec2-d25fb734753d-proxy-tls\") pod \"bb044221-071e-4902-8ec2-d25fb734753d\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " Apr 17 18:18:02.734913 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734845 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb044221-071e-4902-8ec2-d25fb734753d-kserve-provision-location\") pod \"bb044221-071e-4902-8ec2-d25fb734753d\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " Apr 17 18:18:02.734913 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734887 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhzfg\" (UniqueName: \"kubernetes.io/projected/bb044221-071e-4902-8ec2-d25fb734753d-kube-api-access-jhzfg\") pod \"bb044221-071e-4902-8ec2-d25fb734753d\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " Apr 17 18:18:02.735080 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.734935 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb044221-071e-4902-8ec2-d25fb734753d-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"bb044221-071e-4902-8ec2-d25fb734753d\" (UID: \"bb044221-071e-4902-8ec2-d25fb734753d\") " Apr 17 18:18:02.735208 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.735116 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb044221-071e-4902-8ec2-d25fb734753d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bb044221-071e-4902-8ec2-d25fb734753d" (UID: "bb044221-071e-4902-8ec2-d25fb734753d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:18:02.735409 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.735382 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb044221-071e-4902-8ec2-d25fb734753d-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "bb044221-071e-4902-8ec2-d25fb734753d" (UID: "bb044221-071e-4902-8ec2-d25fb734753d"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:18:02.737071 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.737050 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb044221-071e-4902-8ec2-d25fb734753d-kube-api-access-jhzfg" (OuterVolumeSpecName: "kube-api-access-jhzfg") pod "bb044221-071e-4902-8ec2-d25fb734753d" (UID: "bb044221-071e-4902-8ec2-d25fb734753d"). InnerVolumeSpecName "kube-api-access-jhzfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:18:02.737351 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.737334 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb044221-071e-4902-8ec2-d25fb734753d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bb044221-071e-4902-8ec2-d25fb734753d" (UID: "bb044221-071e-4902-8ec2-d25fb734753d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:18:02.750717 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.750701 2566 scope.go:117] "RemoveContainer" containerID="b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08" Apr 17 18:18:02.757937 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.757920 2566 scope.go:117] "RemoveContainer" containerID="dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd" Apr 17 18:18:02.764444 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.764428 2566 scope.go:117] "RemoveContainer" containerID="2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad" Apr 17 18:18:02.764699 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:02.764681 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad\": container with ID starting with 2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad not found: ID does not exist" containerID="2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad" Apr 17 18:18:02.764748 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.764717 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad"} err="failed to get container status \"2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad\": rpc error: code = NotFound desc = could not find container \"2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad\": container with ID starting with 2ac0543b6803c616f07fdbda332ee193999c04b649f598616e8838578c9289ad not found: ID does not exist" Apr 17 18:18:02.764748 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.764736 2566 scope.go:117] "RemoveContainer" containerID="b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08" Apr 17 18:18:02.764984 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:02.764966 2566 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08\": container with ID starting with b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08 not found: ID does not exist" containerID="b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08" Apr 17 18:18:02.765031 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.764992 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08"} err="failed to get container status \"b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08\": rpc error: code = NotFound desc = could not find container \"b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08\": container with ID starting with b1b97f36bfb86e4ee8980f30087c6e3c040de6374ea42bfd44c71fb3d8fdae08 not found: ID does not exist" Apr 17 18:18:02.765031 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.765008 2566 scope.go:117] "RemoveContainer" containerID="dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd" Apr 17 18:18:02.765228 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:02.765212 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd\": container with ID starting with dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd not found: ID does not exist" containerID="dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd" Apr 17 18:18:02.765309 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.765234 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd"} err="failed to get container status \"dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd\": rpc error: code = NotFound desc = could not find container \"dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd\": container with ID starting with dbc467c261370524276f54a12be17ed80744a283fb6453c6f3d42d33181587fd not found: ID does not exist" Apr 17 18:18:02.835775 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.835739 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhzfg\" (UniqueName: \"kubernetes.io/projected/bb044221-071e-4902-8ec2-d25fb734753d-kube-api-access-jhzfg\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:02.835775 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.835781 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb044221-071e-4902-8ec2-d25fb734753d-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:02.836014 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.835800 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb044221-071e-4902-8ec2-d25fb734753d-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:02.836014 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:02.835814 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bb044221-071e-4902-8ec2-d25fb734753d-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:03.050815 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:03.050738 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw"] Apr 17 18:18:03.054228 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:03.054202 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw"] Apr 17 18:18:03.582953 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:03.582913 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9nzhw" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": context deadline exceeded" Apr 17 18:18:04.837895 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:04.837864 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb044221-071e-4902-8ec2-d25fb734753d" path="/var/lib/kubelet/pods/bb044221-071e-4902-8ec2-d25fb734753d/volumes" Apr 17 18:18:08.743528 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:08.743492 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:18:38.747679 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:38.747646 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:18:46.417763 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.417718 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg"] Apr 17 18:18:46.418209 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.418145 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kserve-container" containerID="cri-o://f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50" gracePeriod=30 Apr 17 18:18:46.418298 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.418189 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kube-rbac-proxy" containerID="cri-o://4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b" gracePeriod=30 Apr 17 18:18:46.502471 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502441 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7"] Apr 17 18:18:46.502841 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502822 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="storage-initializer" Apr 17 18:18:46.502893 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502842 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="storage-initializer" Apr 17 18:18:46.502893 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502851 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kserve-container" Apr 17 18:18:46.502893 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502856 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kserve-container" Apr 17 18:18:46.502893 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502871 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kube-rbac-proxy" Apr 17 18:18:46.502893 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502877 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kube-rbac-proxy" Apr 17 18:18:46.503075 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502962 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kserve-container" Apr 17 18:18:46.503075 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.502971 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb044221-071e-4902-8ec2-d25fb734753d" containerName="kube-rbac-proxy" Apr 17 18:18:46.507207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.507191 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.509511 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.509489 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 17 18:18:46.509607 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.509494 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 17 18:18:46.516186 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.516164 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7"] Apr 17 18:18:46.576186 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.576156 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmdl\" (UniqueName: \"kubernetes.io/projected/34c0350c-b616-48b7-9ada-e90510c12ac6-kube-api-access-gbmdl\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.576334 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.576194 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34c0350c-b616-48b7-9ada-e90510c12ac6-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.576334 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.576221 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 
18:18:46.576416 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.576364 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34c0350c-b616-48b7-9ada-e90510c12ac6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.676744 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.676672 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbmdl\" (UniqueName: \"kubernetes.io/projected/34c0350c-b616-48b7-9ada-e90510c12ac6-kube-api-access-gbmdl\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.676744 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.676712 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34c0350c-b616-48b7-9ada-e90510c12ac6-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.676744 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.676734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.676950 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.676775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34c0350c-b616-48b7-9ada-e90510c12ac6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.676950 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:46.676899 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 17 18:18:46.677030 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:46.676969 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls podName:34c0350c-b616-48b7-9ada-e90510c12ac6 nodeName:}" failed. No retries permitted until 2026-04-17 18:18:47.176947283 +0000 UTC m=+3238.847026910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-5f4v7" (UID: "34c0350c-b616-48b7-9ada-e90510c12ac6") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 17 18:18:46.677095 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.677079 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34c0350c-b616-48b7-9ada-e90510c12ac6-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.677458 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.677440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34c0350c-b616-48b7-9ada-e90510c12ac6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.688418 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.688396 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmdl\" (UniqueName: \"kubernetes.io/projected/34c0350c-b616-48b7-9ada-e90510c12ac6-kube-api-access-gbmdl\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:46.861203 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.861169 2566 generic.go:358] "Generic (PLEG): container finished" podID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerID="4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b" exitCode=2 Apr 17 18:18:46.861384 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:46.861243 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerDied","Data":"4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b"} Apr 17 18:18:47.180432 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:47.180395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:47.182766 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:47.182746 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-5f4v7\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:47.417365 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:47.417330 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:47.537240 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:47.537216 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7"] Apr 17 18:18:47.539349 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:18:47.539317 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c0350c_b616_48b7_9ada_e90510c12ac6.slice/crio-19b880aa751dd51571e546239968b27244d752f806b3bb420d8a1a9672f347c9 WatchSource:0}: Error finding container 19b880aa751dd51571e546239968b27244d752f806b3bb420d8a1a9672f347c9: Status 404 returned error can't find the container with id 19b880aa751dd51571e546239968b27244d752f806b3bb420d8a1a9672f347c9 Apr 17 18:18:47.865478 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:47.865443 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerStarted","Data":"b918e813d9413846864ad336d944443e10b1ace6632fd36b308a2c741edc4f52"} Apr 17 18:18:47.865478 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:47.865481 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerStarted","Data":"19b880aa751dd51571e546239968b27244d752f806b3bb420d8a1a9672f347c9"} Apr 17 18:18:48.739948 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:48.739905 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.54:8643/healthz\": dial tcp 10.133.0.54:8643: connect: connection refused" Apr 17 18:18:51.877878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:51.877847 2566 generic.go:358] "Generic (PLEG): container finished" podID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerID="b918e813d9413846864ad336d944443e10b1ace6632fd36b308a2c741edc4f52" exitCode=0 Apr 17 18:18:51.878304 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:51.877887 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerDied","Data":"b918e813d9413846864ad336d944443e10b1ace6632fd36b308a2c741edc4f52"} Apr 17 18:18:52.655190 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:52.655159 2566 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e237db1_4368_4167_aaf6_55756fa0fd69.slice/crio-f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50.scope\": RecentStats: unable to find data in memory cache]" Apr 17 18:18:52.753920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.753896 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:18:52.825401 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.825366 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e237db1-4368-4167-aaf6-55756fa0fd69-proxy-tls\") pod \"7e237db1-4368-4167-aaf6-55756fa0fd69\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " Apr 17 18:18:52.825564 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.825430 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4jfj\" (UniqueName: \"kubernetes.io/projected/7e237db1-4368-4167-aaf6-55756fa0fd69-kube-api-access-f4jfj\") pod \"7e237db1-4368-4167-aaf6-55756fa0fd69\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " Apr 17 18:18:52.825564 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.825455 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e237db1-4368-4167-aaf6-55756fa0fd69-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"7e237db1-4368-4167-aaf6-55756fa0fd69\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " Apr 17 18:18:52.825564 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.825488 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e237db1-4368-4167-aaf6-55756fa0fd69-kserve-provision-location\") pod \"7e237db1-4368-4167-aaf6-55756fa0fd69\" (UID: \"7e237db1-4368-4167-aaf6-55756fa0fd69\") " Apr 17 18:18:52.825846 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.825820 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e237db1-4368-4167-aaf6-55756fa0fd69-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "7e237db1-4368-4167-aaf6-55756fa0fd69" (UID: "7e237db1-4368-4167-aaf6-55756fa0fd69"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:18:52.825912 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.825843 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e237db1-4368-4167-aaf6-55756fa0fd69-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e237db1-4368-4167-aaf6-55756fa0fd69" (UID: "7e237db1-4368-4167-aaf6-55756fa0fd69"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:18:52.827532 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.827506 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e237db1-4368-4167-aaf6-55756fa0fd69-kube-api-access-f4jfj" (OuterVolumeSpecName: "kube-api-access-f4jfj") pod "7e237db1-4368-4167-aaf6-55756fa0fd69" (UID: "7e237db1-4368-4167-aaf6-55756fa0fd69"). InnerVolumeSpecName "kube-api-access-f4jfj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:18:52.827588 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.827569 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e237db1-4368-4167-aaf6-55756fa0fd69-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7e237db1-4368-4167-aaf6-55756fa0fd69" (UID: "7e237db1-4368-4167-aaf6-55756fa0fd69"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:18:52.882500 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.882427 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerStarted","Data":"a480756ac22ed3f46eb186bc2afc1771dc5e6b20bbe05159b864e087fe7af5da"} Apr 17 18:18:52.882500 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.882466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerStarted","Data":"f952ed4242949e2c9c82ab75123c7930b85f05a2e4a1f5ee8fa378b5e730c80b"} Apr 17 18:18:52.882932 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.882683 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:52.884005 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.883984 2566 generic.go:358] "Generic (PLEG): container finished" podID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerID="f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50" exitCode=0 Apr 17 18:18:52.884103 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.884060 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" Apr 17 18:18:52.884169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.884056 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerDied","Data":"f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50"} Apr 17 18:18:52.884169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.884161 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg" event={"ID":"7e237db1-4368-4167-aaf6-55756fa0fd69","Type":"ContainerDied","Data":"0697e2d5c41022583d901f79456f3909904c27e142a83b508acf5e6d7ee628fd"} Apr 17 18:18:52.884291 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.884179 2566 scope.go:117] "RemoveContainer" containerID="4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b" Apr 17 18:18:52.891454 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.891435 2566 scope.go:117] "RemoveContainer" containerID="f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50" Apr 17 18:18:52.898221 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.898207 2566 scope.go:117] "RemoveContainer" containerID="377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f" Apr 17 18:18:52.903735 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.903695 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podStartSLOduration=6.903683183 podStartE2EDuration="6.903683183s" podCreationTimestamp="2026-04-17 18:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:18:52.901660371 +0000 UTC m=+3244.571740014" watchObservedRunningTime="2026-04-17 18:18:52.903683183 +0000 UTC m=+3244.573762825" Apr 17 18:18:52.906552 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.906533 2566 scope.go:117] "RemoveContainer" containerID="4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b" Apr 17 18:18:52.906789 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:52.906771 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b\": container with ID starting with 4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b not found: ID does not exist" containerID="4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b" Apr 17 18:18:52.906833 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.906800 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b"} err="failed to get container status \"4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b\": rpc error: code = NotFound desc = could not find container \"4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b\": container with ID starting with 4c334cbf418167343f3945e463df97f95c529535a2d1274e43ce8c81067d178b not found: ID does not exist" Apr 17 18:18:52.906833 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.906823 2566 scope.go:117] "RemoveContainer" containerID="f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50" Apr 17 18:18:52.907039 ip-10-0-140-147 
kubenswrapper[2566]: E0417 18:18:52.907025 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50\": container with ID starting with f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50 not found: ID does not exist" containerID="f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50" Apr 17 18:18:52.907077 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.907043 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50"} err="failed to get container status \"f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50\": rpc error: code = NotFound desc = could not find container \"f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50\": container with ID starting with f59d18e69ceaf8bdf893701d917885aa12ff0494e3964f55880c80229a457c50 not found: ID does not exist" Apr 17 18:18:52.907077 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.907056 2566 scope.go:117] "RemoveContainer" containerID="377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f" Apr 17 18:18:52.907240 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:18:52.907228 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f\": container with ID starting with 377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f not found: ID does not exist" containerID="377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f" Apr 17 18:18:52.907305 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.907245 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f"} err="failed to get container status \"377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f\": rpc error: code = NotFound desc = could not find container \"377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f\": container with ID starting with 377db2a4f5d66ce8fcedda11c03360969052a15e5ab2e0ba69b77cbbbb21d34f not found: ID does not exist" Apr 17 18:18:52.915054 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.915031 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg"] Apr 17 18:18:52.917341 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.917322 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-p5lrg"] Apr 17 18:18:52.926204 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.926190 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e237db1-4368-4167-aaf6-55756fa0fd69-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:52.926311 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.926216 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4jfj\" (UniqueName: \"kubernetes.io/projected/7e237db1-4368-4167-aaf6-55756fa0fd69-kube-api-access-f4jfj\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:52.926311 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.926235 2566 reconciler_common.go:299] "Volume detached for volume 
\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e237db1-4368-4167-aaf6-55756fa0fd69-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:52.926311 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:52.926281 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e237db1-4368-4167-aaf6-55756fa0fd69-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:18:53.888350 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:53.888314 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:53.889576 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:53.889550 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:18:54.838410 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:54.838377 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" path="/var/lib/kubelet/pods/7e237db1-4368-4167-aaf6-55756fa0fd69/volumes" Apr 17 18:18:54.891392 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:54.891356 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:18:59.895931 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:59.895903 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:18:59.896456 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:18:59.896431 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:19:09.896898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:09.896857 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:19:19.897042 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:19.897003 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:19:29.899611 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:29.899522 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" 
Apr 17 18:19:39.896954 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:39.896912 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:19:49.897211 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:49.897180 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:19:56.626151 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.626077 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7"] Apr 17 18:19:56.626668 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.626421 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" containerID="cri-o://f952ed4242949e2c9c82ab75123c7930b85f05a2e4a1f5ee8fa378b5e730c80b" gracePeriod=30 Apr 17 18:19:56.626668 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.626449 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kube-rbac-proxy" containerID="cri-o://a480756ac22ed3f46eb186bc2afc1771dc5e6b20bbe05159b864e087fe7af5da" gracePeriod=30 Apr 17 18:19:56.723359 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723323 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52"] Apr 17 18:19:56.723672 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723659 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="storage-initializer" Apr 17 18:19:56.723721 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723673 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="storage-initializer" Apr 17 18:19:56.723721 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723687 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kserve-container" Apr 17 18:19:56.723721 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723693 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kserve-container" Apr 17 18:19:56.723721 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723700 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kube-rbac-proxy" Apr 17 18:19:56.723721 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723705 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kube-rbac-proxy" Apr 17 18:19:56.723878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723753 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kube-rbac-proxy" Apr 17 18:19:56.723878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.723762 2566 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="7e237db1-4368-4167-aaf6-55756fa0fd69" containerName="kserve-container" Apr 17 18:19:56.727974 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.727957 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.730306 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.730285 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 17 18:19:56.730598 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.730577 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 17 18:19:56.737546 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.737524 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52"] Apr 17 18:19:56.842567 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.842538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.842731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.842575 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.842731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.842600 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfn6c\" (UniqueName: \"kubernetes.io/projected/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kube-api-access-kfn6c\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.842731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.842716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.943155 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.943076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.943155 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.943117 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.943415 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.943347 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfn6c\" (UniqueName: \"kubernetes.io/projected/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kube-api-access-kfn6c\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.943481 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.943430 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.943780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.943755 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.943920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.943802 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.945640 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.945623 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:56.957898 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:56.957879 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfn6c\" (UniqueName: \"kubernetes.io/projected/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kube-api-access-kfn6c\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:57.040059 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:57.040023 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:19:57.068011 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:57.067979 2566 generic.go:358] "Generic (PLEG): container finished" podID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerID="a480756ac22ed3f46eb186bc2afc1771dc5e6b20bbe05159b864e087fe7af5da" exitCode=2 Apr 17 18:19:57.068133 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:57.068041 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerDied","Data":"a480756ac22ed3f46eb186bc2afc1771dc5e6b20bbe05159b864e087fe7af5da"} Apr 17 18:19:57.160032 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:57.159945 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52"] Apr 17 18:19:57.162604 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:19:57.162577 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a1ecbc8_6814_4a5e_ba13_a9f5aff2cc22.slice/crio-2e7d6b1d87aa2a1c477b4c1e8bf162d54e7287b434087df34339534d3c1047f0 WatchSource:0}: Error finding container 2e7d6b1d87aa2a1c477b4c1e8bf162d54e7287b434087df34339534d3c1047f0: Status 404 returned error can't find the container with id 2e7d6b1d87aa2a1c477b4c1e8bf162d54e7287b434087df34339534d3c1047f0 Apr 17 18:19:58.072490 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:58.072458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerStarted","Data":"e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59"} Apr 17 18:19:58.072490 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:58.072494 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerStarted","Data":"2e7d6b1d87aa2a1c477b4c1e8bf162d54e7287b434087df34339534d3c1047f0"} Apr 17 18:19:59.892034 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:59.891984 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.55:8643/healthz\": dial tcp 10.133.0.55:8643: connect: connection refused" Apr 17 18:19:59.897332 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:19:59.897304 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 17 18:20:00.079889 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.079852 2566 generic.go:358] "Generic (PLEG): container finished" podID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerID="f952ed4242949e2c9c82ab75123c7930b85f05a2e4a1f5ee8fa378b5e730c80b" exitCode=0 Apr 17 18:20:00.080073 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.079957 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" 
event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerDied","Data":"f952ed4242949e2c9c82ab75123c7930b85f05a2e4a1f5ee8fa378b5e730c80b"} Apr 17 18:20:00.163295 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.163273 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:20:00.270271 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.270219 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls\") pod \"34c0350c-b616-48b7-9ada-e90510c12ac6\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " Apr 17 18:20:00.270271 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.270275 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34c0350c-b616-48b7-9ada-e90510c12ac6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"34c0350c-b616-48b7-9ada-e90510c12ac6\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " Apr 17 18:20:00.270515 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.270311 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34c0350c-b616-48b7-9ada-e90510c12ac6-kserve-provision-location\") pod \"34c0350c-b616-48b7-9ada-e90510c12ac6\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " Apr 17 18:20:00.270515 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.270367 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbmdl\" (UniqueName: \"kubernetes.io/projected/34c0350c-b616-48b7-9ada-e90510c12ac6-kube-api-access-gbmdl\") pod \"34c0350c-b616-48b7-9ada-e90510c12ac6\" (UID: \"34c0350c-b616-48b7-9ada-e90510c12ac6\") " Apr 17 18:20:00.270691 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.270649 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c0350c-b616-48b7-9ada-e90510c12ac6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34c0350c-b616-48b7-9ada-e90510c12ac6" (UID: "34c0350c-b616-48b7-9ada-e90510c12ac6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:20:00.270814 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.270699 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34c0350c-b616-48b7-9ada-e90510c12ac6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "34c0350c-b616-48b7-9ada-e90510c12ac6" (UID: "34c0350c-b616-48b7-9ada-e90510c12ac6"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:20:00.272445 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.272425 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "34c0350c-b616-48b7-9ada-e90510c12ac6" (UID: "34c0350c-b616-48b7-9ada-e90510c12ac6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:20:00.272525 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.272485 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c0350c-b616-48b7-9ada-e90510c12ac6-kube-api-access-gbmdl" (OuterVolumeSpecName: "kube-api-access-gbmdl") pod "34c0350c-b616-48b7-9ada-e90510c12ac6" (UID: "34c0350c-b616-48b7-9ada-e90510c12ac6"). InnerVolumeSpecName "kube-api-access-gbmdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:20:00.371717 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.371689 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbmdl\" (UniqueName: \"kubernetes.io/projected/34c0350c-b616-48b7-9ada-e90510c12ac6-kube-api-access-gbmdl\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:20:00.371717 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.371714 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34c0350c-b616-48b7-9ada-e90510c12ac6-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:20:00.371889 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.371726 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34c0350c-b616-48b7-9ada-e90510c12ac6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:20:00.371889 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:00.371739 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34c0350c-b616-48b7-9ada-e90510c12ac6-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:20:01.085033 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.085003 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" event={"ID":"34c0350c-b616-48b7-9ada-e90510c12ac6","Type":"ContainerDied","Data":"19b880aa751dd51571e546239968b27244d752f806b3bb420d8a1a9672f347c9"} Apr 17 18:20:01.085451 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.085052 2566 scope.go:117] "RemoveContainer" containerID="a480756ac22ed3f46eb186bc2afc1771dc5e6b20bbe05159b864e087fe7af5da" Apr 17 18:20:01.085451 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.085064 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7" Apr 17 18:20:01.151010 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.150987 2566 scope.go:117] "RemoveContainer" containerID="f952ed4242949e2c9c82ab75123c7930b85f05a2e4a1f5ee8fa378b5e730c80b" Apr 17 18:20:01.157867 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.157850 2566 scope.go:117] "RemoveContainer" containerID="b918e813d9413846864ad336d944443e10b1ace6632fd36b308a2c741edc4f52" Apr 17 18:20:01.163614 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.163591 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7"] Apr 17 18:20:01.169195 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:01.169176 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-5f4v7"] Apr 17 18:20:02.089991 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:02.089961 2566 generic.go:358] "Generic (PLEG): container finished" podID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerID="e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59" exitCode=0 Apr 17 18:20:02.090399 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:02.090036 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerDied","Data":"e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59"} Apr 17 18:20:02.838365 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:02.838324 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" path="/var/lib/kubelet/pods/34c0350c-b616-48b7-9ada-e90510c12ac6/volumes" Apr 17 18:20:03.094744 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:03.094662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerStarted","Data":"127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470"} Apr 17 18:20:03.094744 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:03.094700 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerStarted","Data":"8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b"} Apr 17 18:20:03.095134 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:03.094901 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:20:03.117716 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:03.117654 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" podStartSLOduration=7.117638381 podStartE2EDuration="7.117638381s" podCreationTimestamp="2026-04-17 18:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:20:03.114739944 +0000 UTC m=+3314.784819596" watchObservedRunningTime="2026-04-17 18:20:03.117638381 +0000 UTC m=+3314.787718024" Apr 17 18:20:04.098609 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:04.098573 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:20:10.106233 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:10.106205 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:20:13.480571 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:13.480536 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:20:13.481948 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:13.481926 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:20:40.132173 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:40.132135 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 17 18:20:50.108583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:50.108553 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:20:56.839220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.839187 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52"] Apr 17 18:20:56.839657 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.839520 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kserve-container" containerID="cri-o://8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b" gracePeriod=30 Apr 17 18:20:56.839657 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.839621 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kube-rbac-proxy" containerID="cri-o://127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470" gracePeriod=30 Apr 17 18:20:56.923268 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923231 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr"] Apr 17 18:20:56.923609 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923592 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" Apr 17 18:20:56.923694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923611 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" Apr 17 18:20:56.923694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923646 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kube-rbac-proxy" Apr 17 18:20:56.923694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923656 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kube-rbac-proxy" Apr 17 
18:20:56.923694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923667 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="storage-initializer" Apr 17 18:20:56.923694 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923676 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="storage-initializer" Apr 17 18:20:56.923970 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923746 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kserve-container" Apr 17 18:20:56.923970 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.923762 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="34c0350c-b616-48b7-9ada-e90510c12ac6" containerName="kube-rbac-proxy" Apr 17 18:20:56.927794 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.927774 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:56.930067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.930047 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 17 18:20:56.930166 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.930116 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 17 18:20:56.939838 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:56.939816 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr"] Apr 17 18:20:57.022057 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.022025 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjjf\" (UniqueName: \"kubernetes.io/projected/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kube-api-access-8rjjf\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.022057 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.022059 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.022312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.022095 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.022312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.022146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls\") 
pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.123016 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.122944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjjf\" (UniqueName: \"kubernetes.io/projected/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kube-api-access-8rjjf\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.123016 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.122978 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.123198 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.123172 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.123280 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.123214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.123377 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.123359 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.123438 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:20:57.123373 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 17 18:20:57.123438 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:20:57.123423 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls podName:e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d nodeName:}" failed. No retries permitted until 2026-04-17 18:20:57.623406764 +0000 UTC m=+3369.293486388 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" (UID: "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 17 18:20:57.123838 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.123817 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.131613 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.131588 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjjf\" (UniqueName: \"kubernetes.io/projected/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kube-api-access-8rjjf\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.248959 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.248929 2566 generic.go:358] "Generic (PLEG): container finished" podID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerID="127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470" exitCode=2 Apr 17 18:20:57.249110 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.248991 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerDied","Data":"127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470"} Apr 17 18:20:57.627505 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.627473 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.629780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.629754 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-trvrr\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.837847 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.837812 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:20:57.959062 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.959013 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr"] Apr 17 18:20:57.961181 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:20:57.961145 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58ac0a3_7130_4bb8_8f32_eaaf55a3fe5d.slice/crio-53aa8719f3947615d1fcac7dab8a2cc071c210714a2655d55b8df205e283cba2 WatchSource:0}: Error finding container 53aa8719f3947615d1fcac7dab8a2cc071c210714a2655d55b8df205e283cba2: Status 404 returned error can't find the container with id 53aa8719f3947615d1fcac7dab8a2cc071c210714a2655d55b8df205e283cba2 Apr 17 18:20:57.962983 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:57.962965 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:20:58.253903 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:58.253824 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerStarted","Data":"19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7"} Apr 17 18:20:58.253903 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:20:58.253861 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerStarted","Data":"53aa8719f3947615d1fcac7dab8a2cc071c210714a2655d55b8df205e283cba2"} Apr 17 18:21:00.101614 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:00.101566 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.56:8643/healthz\": dial tcp 10.133.0.56:8643: connect: connection refused" Apr 17 18:21:02.266640 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:02.266607 2566 generic.go:358] "Generic (PLEG): container finished" podID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerID="19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7" exitCode=0 Apr 17 18:21:02.267024 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:02.266669 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerDied","Data":"19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7"} Apr 17 18:21:03.271359 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:03.271328 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerStarted","Data":"9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b"} Apr 17 18:21:03.271359 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:03.271367 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerStarted","Data":"fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167"} Apr 17 18:21:03.271847 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:21:03.271675 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:21:03.271847 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:03.271785 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:21:03.273051 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:03.273029 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:03.297960 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:03.297916 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podStartSLOduration=7.297904401 podStartE2EDuration="7.297904401s" podCreationTimestamp="2026-04-17 18:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:21:03.295630914 +0000 UTC m=+3374.965710556" watchObservedRunningTime="2026-04-17 18:21:03.297904401 +0000 UTC m=+3374.967984043" Apr 17 18:21:04.175015 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.174993 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:21:04.276163 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.276129 2566 generic.go:358] "Generic (PLEG): container finished" podID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerID="8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b" exitCode=0 Apr 17 18:21:04.276585 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.276217 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" Apr 17 18:21:04.276585 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.276222 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerDied","Data":"8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b"} Apr 17 18:21:04.276585 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.276275 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52" event={"ID":"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22","Type":"ContainerDied","Data":"2e7d6b1d87aa2a1c477b4c1e8bf162d54e7287b434087df34339534d3c1047f0"} Apr 17 18:21:04.276585 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.276295 2566 scope.go:117] "RemoveContainer" containerID="127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470" Apr 17 18:21:04.276835 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.276811 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:04.280269 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280238 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-proxy-tls\") pod \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " Apr 17 18:21:04.280364 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280308 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kserve-provision-location\") pod \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " Apr 17 18:21:04.280364 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280356 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfn6c\" (UniqueName: \"kubernetes.io/projected/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kube-api-access-kfn6c\") pod \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " Apr 17 18:21:04.280490 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280418 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\" (UID: \"5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22\") " Apr 17 18:21:04.280618 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280590 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" (UID: "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:21:04.280731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280679 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:21:04.280814 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.280789 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" (UID: "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:21:04.282402 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.282351 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kube-api-access-kfn6c" (OuterVolumeSpecName: "kube-api-access-kfn6c") pod "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" (UID: "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22"). InnerVolumeSpecName "kube-api-access-kfn6c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:21:04.282475 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.282402 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" (UID: "5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:21:04.286055 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.286029 2566 scope.go:117] "RemoveContainer" containerID="8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b" Apr 17 18:21:04.296558 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.296541 2566 scope.go:117] "RemoveContainer" containerID="e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59" Apr 17 18:21:04.302838 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.302822 2566 scope.go:117] "RemoveContainer" containerID="127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470" Apr 17 18:21:04.303075 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:21:04.303055 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470\": container with ID starting with 127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470 not found: ID does not exist" containerID="127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470" Apr 17 18:21:04.303122 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.303084 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470"} err="failed to get container status \"127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470\": rpc error: code = NotFound desc = could not find container \"127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470\": container with ID starting with 127d2d300572626b735afcd924fb7cdccd6dbab4303522b62d4819192cb63470 not found: ID does not exist" Apr 17 18:21:04.303122 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.303101 2566 scope.go:117] "RemoveContainer" containerID="8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b" Apr 17 18:21:04.303318 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:21:04.303301 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b\": container with ID starting with 8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b not found: ID does not exist" containerID="8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b" Apr 17 18:21:04.303357 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.303327 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b"} err="failed to get container status \"8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b\": rpc error: code = NotFound desc = could not find container \"8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b\": container with ID starting with 8b5c1abecfd9244b2ea8dda12946a34502e75897babd86c5f10228adf467384b not found: ID does not exist" Apr 17 18:21:04.303357 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.303347 2566 scope.go:117] "RemoveContainer" containerID="e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59" Apr 17 18:21:04.303549 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:21:04.303533 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59\": container with ID starting with 
e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59 not found: ID does not exist" containerID="e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59" Apr 17 18:21:04.303602 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.303553 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59"} err="failed to get container status \"e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59\": rpc error: code = NotFound desc = could not find container \"e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59\": container with ID starting with e78db8ac8cdc2dd73e472ba8e38ac92707f0b8b5242a08a2640e3cc4558b4c59 not found: ID does not exist" Apr 17 18:21:04.381223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.381196 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:21:04.381223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.381220 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:21:04.381476 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.381229 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfn6c\" (UniqueName: \"kubernetes.io/projected/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22-kube-api-access-kfn6c\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:21:04.599699 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.599674 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52"] Apr 17 18:21:04.604437 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.604414 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bzt52"] Apr 17 18:21:04.837731 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:04.837626 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" path="/var/lib/kubelet/pods/5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22/volumes" Apr 17 18:21:09.280855 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:09.280825 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:21:09.281370 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:09.281346 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:19.282079 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:19.282043 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:29.281885 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:29.281797 2566 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:39.281682 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:39.281645 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:49.282127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:49.282087 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:21:59.281839 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:21:59.281803 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:22:09.281960 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:09.281933 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:22:17.054642 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.054606 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr"] Apr 17 18:22:17.055052 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.054900 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" containerID="cri-o://fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167" gracePeriod=30 Apr 17 18:22:17.055052 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.054930 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kube-rbac-proxy" containerID="cri-o://9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b" gracePeriod=30 Apr 17 18:22:17.346798 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.346762 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm"] Apr 17 18:22:17.347091 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347078 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="storage-initializer" Apr 17 18:22:17.347135 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347093 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="storage-initializer" Apr 17 18:22:17.347135 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347110 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kserve-container" Apr 17 
18:22:17.347135 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347115 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kserve-container" Apr 17 18:22:17.347135 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347122 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kube-rbac-proxy" Apr 17 18:22:17.347135 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347128 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kube-rbac-proxy" Apr 17 18:22:17.347340 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347177 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kube-rbac-proxy" Apr 17 18:22:17.347340 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.347187 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a1ecbc8-6814-4a5e-ba13-a9f5aff2cc22" containerName="kserve-container" Apr 17 18:22:17.350182 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.350165 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.353627 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.353609 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 17 18:22:17.354234 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.354218 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 17 18:22:17.354352 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.354343 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 17 18:22:17.371009 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.370987 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm"] Apr 17 18:22:17.488198 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.488166 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4j4\" (UniqueName: \"kubernetes.io/projected/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kube-api-access-4w4j4\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.488397 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.488211 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.488397 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.488272 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" 
(UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.488500 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.488398 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.491169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.491145 2566 generic.go:358] "Generic (PLEG): container finished" podID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerID="9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b" exitCode=2 Apr 17 18:22:17.491304 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.491214 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerDied","Data":"9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b"} Apr 17 18:22:17.589239 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.589207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.589441 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.589299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4j4\" (UniqueName: \"kubernetes.io/projected/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kube-api-access-4w4j4\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.589441 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.589332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.589441 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.589379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.589827 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.589805 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.590025 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:22:17.590008 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.591674 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.591655 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.603165 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.603110 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4j4\" (UniqueName: \"kubernetes.io/projected/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kube-api-access-4w4j4\") pod \"isvc-sklearn-s3-predictor-88457d696-qg5rm\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.659378 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.659354 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:17.790652 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:17.790564 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm"] Apr 17 18:22:17.792794 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:22:17.792766 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15a8a9b8_c815_4d64_a3bb_e74f967df5f2.slice/crio-7030ec13ca55e9171e72a213498a8b54c17c549dd7df0e0218b43bb3bf02afd3 WatchSource:0}: Error finding container 7030ec13ca55e9171e72a213498a8b54c17c549dd7df0e0218b43bb3bf02afd3: Status 404 returned error can't find the container with id 7030ec13ca55e9171e72a213498a8b54c17c549dd7df0e0218b43bb3bf02afd3 Apr 17 18:22:18.496038 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:18.496001 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerStarted","Data":"189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1"} Apr 17 18:22:18.496446 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:18.496049 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerStarted","Data":"7030ec13ca55e9171e72a213498a8b54c17c549dd7df0e0218b43bb3bf02afd3"} Apr 17 18:22:19.277938 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:19.277848 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.57:8643/healthz\": dial tcp 10.133.0.57:8643: connect: connection refused" Apr 17 18:22:19.282147 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:19.282123 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 17 18:22:19.499690 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:19.499652 2566 generic.go:358] "Generic (PLEG): container finished" podID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerID="189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1" exitCode=0 Apr 17 18:22:19.500063 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:19.499729 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerDied","Data":"189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1"} Apr 17 18:22:20.397488 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.397465 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:22:20.505414 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.505331 2566 generic.go:358] "Generic (PLEG): container finished" podID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerID="fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167" exitCode=0 Apr 17 18:22:20.505414 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.505394 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerDied","Data":"fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167"} Apr 17 18:22:20.505873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.505418 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" event={"ID":"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d","Type":"ContainerDied","Data":"53aa8719f3947615d1fcac7dab8a2cc071c210714a2655d55b8df205e283cba2"} Apr 17 18:22:20.505873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.505420 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr" Apr 17 18:22:20.505873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.505436 2566 scope.go:117] "RemoveContainer" containerID="9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b" Apr 17 18:22:20.507310 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.507241 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerStarted","Data":"646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30"} Apr 17 18:22:20.507417 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.507316 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerStarted","Data":"ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3"} Apr 17 18:22:20.507488 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.507468 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:20.513336 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.513319 2566 scope.go:117] "RemoveContainer" containerID="fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167" Apr 17 18:22:20.515380 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.515360 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls\") pod \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " Apr 17 18:22:20.515466 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.515448 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " Apr 17 18:22:20.515551 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.515537 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rjjf\" (UniqueName: \"kubernetes.io/projected/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kube-api-access-8rjjf\") pod \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " Apr 17 18:22:20.515616 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.515588 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kserve-provision-location\") pod \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\" (UID: \"e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d\") " Apr 17 18:22:20.515925 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.515801 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" (UID: "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:22:20.516002 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.515946 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" (UID: "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:22:20.517340 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.517317 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" (UID: "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:22:20.517538 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.517515 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kube-api-access-8rjjf" (OuterVolumeSpecName: "kube-api-access-8rjjf") pod "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" (UID: "e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d"). InnerVolumeSpecName "kube-api-access-8rjjf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:22:20.530064 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.530047 2566 scope.go:117] "RemoveContainer" containerID="19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7" Apr 17 18:22:20.532205 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.532151 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podStartSLOduration=3.532137299 podStartE2EDuration="3.532137299s" podCreationTimestamp="2026-04-17 18:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:22:20.53113124 +0000 UTC m=+3452.201210887" watchObservedRunningTime="2026-04-17 18:22:20.532137299 +0000 UTC m=+3452.202216952" Apr 17 18:22:20.539091 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.539072 2566 scope.go:117] "RemoveContainer" containerID="9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b" Apr 17 18:22:20.539371 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:22:20.539353 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b\": container with ID starting with 9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b not found: ID does not exist" containerID="9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b" Apr 17 18:22:20.539423 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.539379 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b"} err="failed to get container status \"9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b\": rpc error: code = NotFound desc = could not find container \"9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b\": container with ID starting with 9ec47df58085dad46ee1e7994ce2712648ca3251f8670eb7770fb4e1e5c9865b not found: 
ID does not exist" Apr 17 18:22:20.539423 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.539397 2566 scope.go:117] "RemoveContainer" containerID="fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167" Apr 17 18:22:20.539632 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:22:20.539615 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167\": container with ID starting with fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167 not found: ID does not exist" containerID="fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167" Apr 17 18:22:20.539674 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.539638 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167"} err="failed to get container status \"fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167\": rpc error: code = NotFound desc = could not find container \"fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167\": container with ID starting with fc150ca7c95799a069ed995a2cc0884183df45aa75681717c44e031e3f6ad167 not found: ID does not exist" Apr 17 18:22:20.539674 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.539655 2566 scope.go:117] "RemoveContainer" containerID="19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7" Apr 17 18:22:20.539886 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:22:20.539869 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7\": container with ID starting with 19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7 not found: ID does not exist" containerID="19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7" Apr 17 18:22:20.539936 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.539889 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7"} err="failed to get container status \"19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7\": rpc error: code = NotFound desc = could not find container \"19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7\": container with ID starting with 19ffbbb23c93b5d9f8ab18e66b71eb6447d2bb8b3183b775118d8b93576246f7 not found: ID does not exist" Apr 17 18:22:20.616215 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.616184 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rjjf\" (UniqueName: \"kubernetes.io/projected/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kube-api-access-8rjjf\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:22:20.616215 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.616210 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:22:20.616478 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.616223 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" 
DevicePath \"\"" Apr 17 18:22:20.616478 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.616237 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:22:20.833578 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.833550 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr"] Apr 17 18:22:20.838455 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:20.838432 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-trvrr"] Apr 17 18:22:21.511312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:21.511287 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:21.512230 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:21.512203 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:22:22.519332 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:22.519294 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:22:22.837832 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:22.837802 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" path="/var/lib/kubelet/pods/e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d/volumes" Apr 17 18:22:27.523393 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:27.523363 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:22:27.523863 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:27.523837 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:22:37.524844 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:37.524801 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:22:47.524100 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:47.524061 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:22:57.524204 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:22:57.524117 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:23:07.524111 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:07.524067 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:23:17.523863 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:17.523821 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:23:27.525011 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:27.524984 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:23:37.237588 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.237555 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm"] Apr 17 18:23:37.237997 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.237947 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" containerID="cri-o://ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3" gracePeriod=30 Apr 17 18:23:37.238091 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.237978 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kube-rbac-proxy" containerID="cri-o://646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30" gracePeriod=30 Apr 17 18:23:37.366219 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366186 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft"] Apr 17 18:23:37.366529 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366516 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kube-rbac-proxy" Apr 17 18:23:37.366610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366530 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kube-rbac-proxy" Apr 17 18:23:37.366610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366543 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="storage-initializer" Apr 17 18:23:37.366610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366549 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="storage-initializer" Apr 17 18:23:37.366610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366557 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" 
containerName="kserve-container" Apr 17 18:23:37.366610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366563 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" Apr 17 18:23:37.366610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366605 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kserve-container" Apr 17 18:23:37.366915 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.366617 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e58ac0a3-7130-4bb8-8f32-eaaf55a3fe5d" containerName="kube-rbac-proxy" Apr 17 18:23:37.369579 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.369561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.371714 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.371692 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 17 18:23:37.371818 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.371715 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 17 18:23:37.371818 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.371745 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 18:23:37.378362 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.378333 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft"] Apr 17 18:23:37.443026 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.442995 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzvv\" (UniqueName: \"kubernetes.io/projected/9f85dfa4-6262-4962-a062-5464113e0214-kube-api-access-4tzvv\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.443026 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.443030 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85dfa4-6262-4962-a062-5464113e0214-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.443312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.443050 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.443312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.443162 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.443312 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.443244 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.520399 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.520304 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.58:8643/healthz\": dial tcp 10.133.0.58:8643: connect: connection refused" Apr 17 18:23:37.524180 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.524158 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 17 18:23:37.544468 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.544444 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzvv\" (UniqueName: \"kubernetes.io/projected/9f85dfa4-6262-4962-a062-5464113e0214-kube-api-access-4tzvv\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.544584 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.544478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85dfa4-6262-4962-a062-5464113e0214-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.544584 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.544497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.544584 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.544526 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.544584 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.544570 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.544813 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:23:37.544693 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 17 18:23:37.544813 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:23:37.544755 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls podName:9f85dfa4-6262-4962-a062-5464113e0214 nodeName:}" failed. No retries permitted until 2026-04-17 18:23:38.044733692 +0000 UTC m=+3529.714813318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls") pod "isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" (UID: "9f85dfa4-6262-4962-a062-5464113e0214") : secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 17 18:23:37.544936 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.544876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85dfa4-6262-4962-a062-5464113e0214-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.545300 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.545279 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.545300 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.545292 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.555639 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.555622 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzvv\" (UniqueName: \"kubernetes.io/projected/9f85dfa4-6262-4962-a062-5464113e0214-kube-api-access-4tzvv\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:37.733598 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.733564 2566 generic.go:358] "Generic (PLEG): container finished" podID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerID="646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30" exitCode=2 Apr 17 18:23:37.733779 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:37.733622 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerDied","Data":"646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30"} Apr 17 18:23:38.048514 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:38.048480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:38.050833 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:38.050815 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:38.280209 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:38.280177 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:38.400933 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:38.400908 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft"] Apr 17 18:23:38.403563 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:23:38.403537 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f85dfa4_6262_4962_a062_5464113e0214.slice/crio-0389175ff8884d931b8d730412e336a2a97459ddcfca48792cb102fc55de06b2 WatchSource:0}: Error finding container 0389175ff8884d931b8d730412e336a2a97459ddcfca48792cb102fc55de06b2: Status 404 returned error can't find the container with id 0389175ff8884d931b8d730412e336a2a97459ddcfca48792cb102fc55de06b2 Apr 17 18:23:38.737808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:38.737713 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerStarted","Data":"d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3"} Apr 17 18:23:38.737808 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:38.737757 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerStarted","Data":"0389175ff8884d931b8d730412e336a2a97459ddcfca48792cb102fc55de06b2"} Apr 17 18:23:39.741943 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:39.741853 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="9f85dfa4-6262-4962-a062-5464113e0214" containerID="d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3" exitCode=0 Apr 17 18:23:39.742324 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:39.741935 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerDied","Data":"d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3"} Apr 17 18:23:40.746636 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:40.746604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerStarted","Data":"9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425"} Apr 17 18:23:40.746636 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:40.746639 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerStarted","Data":"9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f"} Apr 17 18:23:40.747078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:40.746852 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:40.747078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:40.746967 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:40.748330 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:40.748304 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:23:40.770419 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:40.770376 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podStartSLOduration=3.770366282 podStartE2EDuration="3.770366282s" podCreationTimestamp="2026-04-17 18:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:23:40.768620644 +0000 UTC m=+3532.438700285" watchObservedRunningTime="2026-04-17 18:23:40.770366282 +0000 UTC m=+3532.440445925" Apr 17 18:23:41.383827 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.383805 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:23:41.482095 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.482066 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-proxy-tls\") pod \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " Apr 17 18:23:41.482242 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.482120 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " Apr 17 18:23:41.482242 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.482170 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kserve-provision-location\") pod \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " Apr 17 18:23:41.482242 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.482217 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w4j4\" (UniqueName: \"kubernetes.io/projected/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kube-api-access-4w4j4\") pod \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\" (UID: \"15a8a9b8-c815-4d64-a3bb-e74f967df5f2\") " Apr 17 18:23:41.482526 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.482492 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15a8a9b8-c815-4d64-a3bb-e74f967df5f2" (UID: "15a8a9b8-c815-4d64-a3bb-e74f967df5f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:23:41.482646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.482526 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "15a8a9b8-c815-4d64-a3bb-e74f967df5f2" (UID: "15a8a9b8-c815-4d64-a3bb-e74f967df5f2"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:23:41.484070 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.484047 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "15a8a9b8-c815-4d64-a3bb-e74f967df5f2" (UID: "15a8a9b8-c815-4d64-a3bb-e74f967df5f2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:23:41.484293 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.484243 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kube-api-access-4w4j4" (OuterVolumeSpecName: "kube-api-access-4w4j4") pod "15a8a9b8-c815-4d64-a3bb-e74f967df5f2" (UID: "15a8a9b8-c815-4d64-a3bb-e74f967df5f2"). 
InnerVolumeSpecName "kube-api-access-4w4j4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:23:41.582754 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.582723 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:23:41.582754 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.582749 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:23:41.582754 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.582759 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4w4j4\" (UniqueName: \"kubernetes.io/projected/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-kube-api-access-4w4j4\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:23:41.582963 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.582769 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15a8a9b8-c815-4d64-a3bb-e74f967df5f2-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:23:41.751599 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.751507 2566 generic.go:358] "Generic (PLEG): container finished" podID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerID="ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3" exitCode=0 Apr 17 18:23:41.751599 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.751590 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" Apr 17 18:23:41.752071 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.751582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerDied","Data":"ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3"} Apr 17 18:23:41.752071 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.751697 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm" event={"ID":"15a8a9b8-c815-4d64-a3bb-e74f967df5f2","Type":"ContainerDied","Data":"7030ec13ca55e9171e72a213498a8b54c17c549dd7df0e0218b43bb3bf02afd3"} Apr 17 18:23:41.752071 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.751715 2566 scope.go:117] "RemoveContainer" containerID="646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30" Apr 17 18:23:41.752289 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.752167 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:23:41.760201 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.760188 2566 scope.go:117] "RemoveContainer" containerID="ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3" Apr 17 18:23:41.767189 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.767174 2566 scope.go:117] "RemoveContainer" containerID="189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1" Apr 17 18:23:41.773720 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.773705 2566 scope.go:117] "RemoveContainer" containerID="646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30" Apr 17 18:23:41.773946 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:23:41.773927 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30\": container with ID starting with 646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30 not found: ID does not exist" containerID="646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30" Apr 17 18:23:41.773994 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.773952 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30"} err="failed to get container status \"646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30\": rpc error: code = NotFound desc = could not find container \"646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30\": container with ID starting with 646f15fc16a927f69af14865726ca4c662d8d7649bebd5d0a73b6926ed51cf30 not found: ID does not exist" Apr 17 18:23:41.773994 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.773968 2566 scope.go:117] "RemoveContainer" containerID="ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3" Apr 17 18:23:41.774213 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:23:41.774195 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3\": 
container with ID starting with ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3 not found: ID does not exist" containerID="ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3" Apr 17 18:23:41.774313 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.774222 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3"} err="failed to get container status \"ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3\": rpc error: code = NotFound desc = could not find container \"ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3\": container with ID starting with ed4201d98d070bb1367bf3a69adf0bba003b3f652d45aa7c14e53ee9dda0f2a3 not found: ID does not exist" Apr 17 18:23:41.774313 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.774240 2566 scope.go:117] "RemoveContainer" containerID="189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1" Apr 17 18:23:41.774557 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:23:41.774536 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1\": container with ID starting with 189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1 not found: ID does not exist" containerID="189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1" Apr 17 18:23:41.774639 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.774566 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1"} err="failed to get container status \"189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1\": rpc error: code = NotFound desc = could not find container \"189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1\": container with ID starting with 189c8ab7e9f19099ac0b28e6c00e2ad3e2865131ac30fd45d2fa06ac46f641f1 not found: ID does not exist" Apr 17 18:23:41.776276 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.776240 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm"] Apr 17 18:23:41.782006 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:41.781988 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-qg5rm"] Apr 17 18:23:42.837857 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:42.837825 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" path="/var/lib/kubelet/pods/15a8a9b8-c815-4d64-a3bb-e74f967df5f2/volumes" Apr 17 18:23:46.756650 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:46.756621 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:23:46.757171 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:46.757148 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:23:56.757308 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:23:56.757249 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:24:06.757524 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:06.757478 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:24:16.757127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:16.757088 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:24:26.757220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:26.757130 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:24:36.758095 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:36.758055 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 17 18:24:46.757402 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:46.757367 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:24:47.425956 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:47.425926 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft"] Apr 17 18:24:47.426288 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:47.426215 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" containerID="cri-o://9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f" gracePeriod=30 Apr 17 18:24:47.426393 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:47.426293 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kube-rbac-proxy" containerID="cri-o://9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425" gracePeriod=30 Apr 17 18:24:47.940670 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:47.940636 2566 generic.go:358] "Generic (PLEG): container finished" podID="9f85dfa4-6262-4962-a062-5464113e0214" containerID="9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425" exitCode=2 Apr 17 18:24:47.941070 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:47.940710 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerDied","Data":"9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425"} Apr 17 18:24:48.483396 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483358 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b"] Apr 17 18:24:48.483726 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483713 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" Apr 17 18:24:48.483784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483727 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" Apr 17 18:24:48.483784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483743 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="storage-initializer" Apr 17 18:24:48.483784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483748 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="storage-initializer" Apr 17 18:24:48.483784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483765 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kube-rbac-proxy" Apr 17 18:24:48.483784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483771 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kube-rbac-proxy" Apr 17 18:24:48.483960 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483812 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kserve-container" Apr 17 18:24:48.483960 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.483821 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="15a8a9b8-c815-4d64-a3bb-e74f967df5f2" containerName="kube-rbac-proxy" Apr 17 18:24:48.486922 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.486907 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.489557 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.489535 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 17 18:24:48.489718 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.489703 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 17 18:24:48.496226 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.496206 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b"] Apr 17 18:24:48.620925 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.620888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlhp4\" (UniqueName: \"kubernetes.io/projected/c3de0b41-a670-44ce-9641-575ddc679933-kube-api-access-qlhp4\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.621080 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.620945 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.621080 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.621002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3de0b41-a670-44ce-9641-575ddc679933-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.621080 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.621037 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3de0b41-a670-44ce-9641-575ddc679933-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.721805 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.721771 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.721980 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.721817 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3de0b41-a670-44ce-9641-575ddc679933-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.721980 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.721850 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3de0b41-a670-44ce-9641-575ddc679933-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.721980 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.721879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlhp4\" (UniqueName: \"kubernetes.io/projected/c3de0b41-a670-44ce-9641-575ddc679933-kube-api-access-qlhp4\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.721980 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:24:48.721927 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 17 18:24:48.722167 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:24:48.722012 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls podName:c3de0b41-a670-44ce-9641-575ddc679933 nodeName:}" failed. No retries permitted until 2026-04-17 18:24:49.221991871 +0000 UTC m=+3600.892071512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" (UID: "c3de0b41-a670-44ce-9641-575ddc679933") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 17 18:24:48.722225 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.722210 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3de0b41-a670-44ce-9641-575ddc679933-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.722560 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.722541 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3de0b41-a670-44ce-9641-575ddc679933-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:48.731165 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:48.731140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlhp4\" (UniqueName: \"kubernetes.io/projected/c3de0b41-a670-44ce-9641-575ddc679933-kube-api-access-qlhp4\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:49.226986 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:49.226952 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:49.229311 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:49.229288 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:49.397810 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:49.397776 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:49.521750 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:49.521665 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b"] Apr 17 18:24:49.524426 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:24:49.524395 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3de0b41_a670_44ce_9641_575ddc679933.slice/crio-2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa WatchSource:0}: Error finding container 2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa: Status 404 returned error can't find the container with id 2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa Apr 17 18:24:49.948753 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:49.948717 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" event={"ID":"c3de0b41-a670-44ce-9641-575ddc679933","Type":"ContainerStarted","Data":"175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31"} Apr 17 18:24:49.948753 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:49.948754 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" event={"ID":"c3de0b41-a670-44ce-9641-575ddc679933","Type":"ContainerStarted","Data":"2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa"} Apr 17 18:24:51.666184 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.666162 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:24:51.747101 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747034 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls\") pod \"9f85dfa4-6262-4962-a062-5464113e0214\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " Apr 17 18:24:51.747229 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747130 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"9f85dfa4-6262-4962-a062-5464113e0214\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " Apr 17 18:24:51.747305 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747290 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tzvv\" (UniqueName: \"kubernetes.io/projected/9f85dfa4-6262-4962-a062-5464113e0214-kube-api-access-4tzvv\") pod \"9f85dfa4-6262-4962-a062-5464113e0214\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " Apr 17 18:24:51.747378 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747364 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-cabundle-cert\") pod \"9f85dfa4-6262-4962-a062-5464113e0214\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " Apr 17 18:24:51.747415 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747398 2566 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85dfa4-6262-4962-a062-5464113e0214-kserve-provision-location\") pod \"9f85dfa4-6262-4962-a062-5464113e0214\" (UID: \"9f85dfa4-6262-4962-a062-5464113e0214\") " Apr 17 18:24:51.747590 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747560 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "9f85dfa4-6262-4962-a062-5464113e0214" (UID: "9f85dfa4-6262-4962-a062-5464113e0214"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:24:51.747778 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747748 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9f85dfa4-6262-4962-a062-5464113e0214" (UID: "9f85dfa4-6262-4962-a062-5464113e0214"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:24:51.747884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.747748 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f85dfa4-6262-4962-a062-5464113e0214-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f85dfa4-6262-4962-a062-5464113e0214" (UID: "9f85dfa4-6262-4962-a062-5464113e0214"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:24:51.749288 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.749266 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9f85dfa4-6262-4962-a062-5464113e0214" (UID: "9f85dfa4-6262-4962-a062-5464113e0214"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:24:51.749380 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.749354 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f85dfa4-6262-4962-a062-5464113e0214-kube-api-access-4tzvv" (OuterVolumeSpecName: "kube-api-access-4tzvv") pod "9f85dfa4-6262-4962-a062-5464113e0214" (UID: "9f85dfa4-6262-4962-a062-5464113e0214"). InnerVolumeSpecName "kube-api-access-4tzvv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:24:51.847999 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.847976 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-cabundle-cert\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:51.847999 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.847998 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85dfa4-6262-4962-a062-5464113e0214-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:51.848140 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.848009 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85dfa4-6262-4962-a062-5464113e0214-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:51.848140 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.848019 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85dfa4-6262-4962-a062-5464113e0214-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:51.848140 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.848030 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4tzvv\" (UniqueName: \"kubernetes.io/projected/9f85dfa4-6262-4962-a062-5464113e0214-kube-api-access-4tzvv\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:51.956067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.956035 2566 generic.go:358] "Generic (PLEG): container finished" podID="9f85dfa4-6262-4962-a062-5464113e0214" containerID="9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f" exitCode=0 Apr 17 18:24:51.956207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.956073 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerDied","Data":"9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f"} Apr 17 18:24:51.956207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.956101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" event={"ID":"9f85dfa4-6262-4962-a062-5464113e0214","Type":"ContainerDied","Data":"0389175ff8884d931b8d730412e336a2a97459ddcfca48792cb102fc55de06b2"} Apr 17 18:24:51.956207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.956111 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft" Apr 17 18:24:51.956207 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.956119 2566 scope.go:117] "RemoveContainer" containerID="9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425" Apr 17 18:24:51.964609 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.964588 2566 scope.go:117] "RemoveContainer" containerID="9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f" Apr 17 18:24:51.971774 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.971760 2566 scope.go:117] "RemoveContainer" containerID="d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3" Apr 17 18:24:51.978151 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.978124 2566 scope.go:117] "RemoveContainer" containerID="9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425" Apr 17 18:24:51.978474 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:24:51.978446 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425\": container with ID starting with 9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425 not found: ID does not exist" containerID="9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425" Apr 17 18:24:51.978565 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.978476 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425"} err="failed to get container status \"9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425\": rpc error: code = NotFound desc = could not find container \"9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425\": container with ID starting with 9ceaa7ee45d22478133fde68373a9a503f58d348e7c9cbc7f5ac8c471717e425 not found: ID does not exist" Apr 17 18:24:51.978565 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.978498 2566 scope.go:117] "RemoveContainer" containerID="9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f" Apr 17 18:24:51.978860 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:24:51.978837 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f\": container with ID starting with 9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f not found: ID does not exist" containerID="9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f" Apr 17 18:24:51.978921 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.978870 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f"} err="failed to get container status \"9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f\": rpc error: code = NotFound desc = could not find container \"9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f\": container with ID starting with 9effdf04b34aaf99cfeb40d15523641b01cc337d5e715df57225816972cd442f not found: ID does not exist" Apr 17 18:24:51.978921 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.978890 2566 scope.go:117] "RemoveContainer" containerID="d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3" Apr 17 18:24:51.979319 ip-10-0-140-147 kubenswrapper[2566]: E0417 
18:24:51.979298 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3\": container with ID starting with d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3 not found: ID does not exist" containerID="d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3" Apr 17 18:24:51.979446 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.979326 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3"} err="failed to get container status \"d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3\": rpc error: code = NotFound desc = could not find container \"d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3\": container with ID starting with d1f07b3e99e838d0ece02637980c7ccd3f5494f1f5ee85c82d0d16aea794b7f3 not found: ID does not exist" Apr 17 18:24:51.980623 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.980605 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft"] Apr 17 18:24:51.983750 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:51.983727 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-wq9ft"] Apr 17 18:24:52.838174 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:52.838146 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f85dfa4-6262-4962-a062-5464113e0214" path="/var/lib/kubelet/pods/9f85dfa4-6262-4962-a062-5464113e0214/volumes" Apr 17 18:24:52.959583 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:52.959557 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/0.log" Apr 17 18:24:52.959741 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:52.959593 2566 generic.go:358] "Generic (PLEG): container finished" podID="c3de0b41-a670-44ce-9641-575ddc679933" containerID="175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31" exitCode=1 Apr 17 18:24:52.959741 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:52.959673 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" event={"ID":"c3de0b41-a670-44ce-9641-575ddc679933","Type":"ContainerDied","Data":"175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31"} Apr 17 18:24:53.964944 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:53.964919 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/0.log" Apr 17 18:24:53.965368 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:53.964997 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" event={"ID":"c3de0b41-a670-44ce-9641-575ddc679933","Type":"ContainerStarted","Data":"9395c62846294a6d8fdd8a3775b742c183348c0ad931366d356f94d9d2d36939"} Apr 17 18:24:56.973269 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.973236 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/1.log" Apr 17 18:24:56.973714 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.973586 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/0.log" Apr 17 18:24:56.973714 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.973622 2566 generic.go:358] "Generic (PLEG): container finished" podID="c3de0b41-a670-44ce-9641-575ddc679933" containerID="9395c62846294a6d8fdd8a3775b742c183348c0ad931366d356f94d9d2d36939" exitCode=1 Apr 17 18:24:56.973714 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.973686 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" event={"ID":"c3de0b41-a670-44ce-9641-575ddc679933","Type":"ContainerDied","Data":"9395c62846294a6d8fdd8a3775b742c183348c0ad931366d356f94d9d2d36939"} Apr 17 18:24:56.973862 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.973724 2566 scope.go:117] "RemoveContainer" containerID="175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31" Apr 17 18:24:56.974095 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.974077 2566 scope.go:117] "RemoveContainer" containerID="175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31" Apr 17 18:24:56.983788 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:24:56.983762 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_kserve-ci-e2e-test_c3de0b41-a670-44ce-9641-575ddc679933_0 in pod sandbox 2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa from index: no such id: '175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31'" containerID="175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31" Apr 17 18:24:56.983875 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:56.983795 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_kserve-ci-e2e-test_c3de0b41-a670-44ce-9641-575ddc679933_0 in pod sandbox 2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa from index: no such id: '175e043222e9bc6d4df890341f5f4d4ad604829dd15950ad5366ca0f1bbe8c31'" Apr 17 18:24:56.984018 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:24:56.983997 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_kserve-ci-e2e-test(c3de0b41-a670-44ce-9641-575ddc679933)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" podUID="c3de0b41-a670-44ce-9641-575ddc679933" Apr 17 18:24:57.978099 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:57.978073 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/1.log" Apr 17 
18:24:58.472861 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.472831 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b"] Apr 17 18:24:58.596837 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.596817 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/1.log" Apr 17 18:24:58.596966 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.596882 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:58.704048 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.704020 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlhp4\" (UniqueName: \"kubernetes.io/projected/c3de0b41-a670-44ce-9641-575ddc679933-kube-api-access-qlhp4\") pod \"c3de0b41-a670-44ce-9641-575ddc679933\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " Apr 17 18:24:58.704220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.704101 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls\") pod \"c3de0b41-a670-44ce-9641-575ddc679933\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " Apr 17 18:24:58.704220 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.704201 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3de0b41-a670-44ce-9641-575ddc679933-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"c3de0b41-a670-44ce-9641-575ddc679933\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " Apr 17 18:24:58.704363 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.704276 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3de0b41-a670-44ce-9641-575ddc679933-kserve-provision-location\") pod \"c3de0b41-a670-44ce-9641-575ddc679933\" (UID: \"c3de0b41-a670-44ce-9641-575ddc679933\") " Apr 17 18:24:58.704529 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.704501 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3de0b41-a670-44ce-9641-575ddc679933-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c3de0b41-a670-44ce-9641-575ddc679933" (UID: "c3de0b41-a670-44ce-9641-575ddc679933"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:24:58.704610 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.704590 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3de0b41-a670-44ce-9641-575ddc679933-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "c3de0b41-a670-44ce-9641-575ddc679933" (UID: "c3de0b41-a670-44ce-9641-575ddc679933"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:24:58.706087 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.706065 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c3de0b41-a670-44ce-9641-575ddc679933" (UID: "c3de0b41-a670-44ce-9641-575ddc679933"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:24:58.706141 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.706105 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3de0b41-a670-44ce-9641-575ddc679933-kube-api-access-qlhp4" (OuterVolumeSpecName: "kube-api-access-qlhp4") pod "c3de0b41-a670-44ce-9641-575ddc679933" (UID: "c3de0b41-a670-44ce-9641-575ddc679933"). InnerVolumeSpecName "kube-api-access-qlhp4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:24:58.805799 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.805725 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qlhp4\" (UniqueName: \"kubernetes.io/projected/c3de0b41-a670-44ce-9641-575ddc679933-kube-api-access-qlhp4\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:58.805799 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.805751 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3de0b41-a670-44ce-9641-575ddc679933-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:58.805799 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.805762 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3de0b41-a670-44ce-9641-575ddc679933-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:58.805799 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.805773 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3de0b41-a670-44ce-9641-575ddc679933-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:24:58.982215 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.982188 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b_c3de0b41-a670-44ce-9641-575ddc679933/storage-initializer/1.log" Apr 17 18:24:58.982614 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.982325 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" Apr 17 18:24:58.982614 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.982320 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b" event={"ID":"c3de0b41-a670-44ce-9641-575ddc679933","Type":"ContainerDied","Data":"2e0d5c6cf71ac97a1dea439069b5f00b2d916c523477a655a08ff71afed69ffa"} Apr 17 18:24:58.982614 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:58.982438 2566 scope.go:117] "RemoveContainer" containerID="9395c62846294a6d8fdd8a3775b742c183348c0ad931366d356f94d9d2d36939" Apr 17 18:24:59.012957 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.012929 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b"] Apr 17 18:24:59.016715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.016695 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-h4j9b"] Apr 17 18:24:59.571995 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.571966 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp"] Apr 17 18:24:59.572309 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572243 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3de0b41-a670-44ce-9641-575ddc679933" containerName="storage-initializer" Apr 17 18:24:59.572309 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572284 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3de0b41-a670-44ce-9641-575ddc679933" containerName="storage-initializer" Apr 17 18:24:59.572309 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572301 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kube-rbac-proxy" Apr 17 18:24:59.572309 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572309 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kube-rbac-proxy" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572356 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="storage-initializer" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572366 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="storage-initializer" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572373 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572380 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572427 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3de0b41-a670-44ce-9641-575ddc679933" containerName="storage-initializer" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572438 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kserve-container" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572452 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f85dfa4-6262-4962-a062-5464113e0214" containerName="kube-rbac-proxy" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572509 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3de0b41-a670-44ce-9641-575ddc679933" containerName="storage-initializer" Apr 17 18:24:59.572568 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572515 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3de0b41-a670-44ce-9641-575ddc679933" containerName="storage-initializer" Apr 17 18:24:59.572989 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.572574 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3de0b41-a670-44ce-9641-575ddc679933" containerName="storage-initializer" Apr 17 18:24:59.576778 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.576757 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.579459 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.579404 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 17 18:24:59.579599 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.579485 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 18:24:59.579599 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.579487 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 18:24:59.579684 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.579598 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 17 18:24:59.579684 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.579599 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 18:24:59.580334 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.580316 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b25c4\"" Apr 17 18:24:59.580454 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.580316 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 17 18:24:59.587911 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.587887 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp"] Apr 17 18:24:59.711864 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.711837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b03256a3-ec2d-4b64-9922-7269c901288f-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.712067 ip-10-0-140-147 kubenswrapper[2566]: I0417 
18:24:59.711875 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.712067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.711898 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46v2r\" (UniqueName: \"kubernetes.io/projected/b03256a3-ec2d-4b64-9922-7269c901288f-kube-api-access-46v2r\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.712067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.711946 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.712067 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.711988 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03256a3-ec2d-4b64-9922-7269c901288f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.812895 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.812866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b03256a3-ec2d-4b64-9922-7269c901288f-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813072 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.812904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813072 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.812924 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46v2r\" (UniqueName: \"kubernetes.io/projected/b03256a3-ec2d-4b64-9922-7269c901288f-kube-api-access-46v2r\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813072 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.812945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813072 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.812962 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03256a3-ec2d-4b64-9922-7269c901288f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813495 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.813468 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03256a3-ec2d-4b64-9922-7269c901288f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813741 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.813723 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.813797 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.813732 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.815285 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.815244 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b03256a3-ec2d-4b64-9922-7269c901288f-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.823820 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:24:59.823755 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46v2r\" (UniqueName: \"kubernetes.io/projected/b03256a3-ec2d-4b64-9922-7269c901288f-kube-api-access-46v2r\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:24:59.886966 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:24:59.886939 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:25:00.004424 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:00.004397 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp"] Apr 17 18:25:00.006883 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:25:00.006855 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03256a3_ec2d_4b64_9922_7269c901288f.slice/crio-6c4a95f4ed2aec684ac5c1900dfe9e8dea84a772a9834b7be6c04a97979dd5a8 WatchSource:0}: Error finding container 6c4a95f4ed2aec684ac5c1900dfe9e8dea84a772a9834b7be6c04a97979dd5a8: Status 404 returned error can't find the container with id 6c4a95f4ed2aec684ac5c1900dfe9e8dea84a772a9834b7be6c04a97979dd5a8 Apr 17 18:25:00.838301 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:00.838270 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3de0b41-a670-44ce-9641-575ddc679933" path="/var/lib/kubelet/pods/c3de0b41-a670-44ce-9641-575ddc679933/volumes" Apr 17 18:25:00.990715 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:00.990688 2566 generic.go:358] "Generic (PLEG): container finished" podID="b03256a3-ec2d-4b64-9922-7269c901288f" containerID="8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9" exitCode=0 Apr 17 18:25:00.990813 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:00.990730 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerDied","Data":"8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9"} Apr 17 18:25:00.990813 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:00.990758 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerStarted","Data":"6c4a95f4ed2aec684ac5c1900dfe9e8dea84a772a9834b7be6c04a97979dd5a8"} Apr 17 18:25:01.995878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:01.995840 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerStarted","Data":"2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa"} Apr 17 18:25:01.995878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:01.995875 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerStarted","Data":"af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f"} Apr 17 18:25:01.996389 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:01.996011 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:25:02.017476 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:02.017429 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podStartSLOduration=3.017415533 
podStartE2EDuration="3.017415533s" podCreationTimestamp="2026-04-17 18:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:25:02.015215988 +0000 UTC m=+3613.685295631" watchObservedRunningTime="2026-04-17 18:25:02.017415533 +0000 UTC m=+3613.687495175" Apr 17 18:25:03.001668 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:03.001636 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:25:03.002824 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:03.002796 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:04.004442 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:04.004402 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:09.008771 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:09.008742 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:25:09.009321 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:09.009294 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:13.502515 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:13.502485 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:25:13.506538 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:13.506518 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:25:19.010041 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:19.009999 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:29.009331 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:29.009292 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:39.009310 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:39.009270 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:49.009432 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:49.009392 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:25:59.009848 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:25:59.009759 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 17 18:26:09.010111 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:09.010085 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:26:09.614714 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:09.614680 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp"] Apr 17 18:26:09.615000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:09.614974 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" containerID="cri-o://af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f" gracePeriod=30 Apr 17 18:26:09.615079 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:09.615052 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kube-rbac-proxy" containerID="cri-o://2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa" gracePeriod=30 Apr 17 18:26:10.203884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.203853 2566 generic.go:358] "Generic (PLEG): container finished" podID="b03256a3-ec2d-4b64-9922-7269c901288f" containerID="2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa" exitCode=2 Apr 17 18:26:10.203884 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.203882 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerDied","Data":"2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa"} Apr 17 18:26:10.678629 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.678600 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5"] Apr 17 18:26:10.682013 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.681995 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.684461 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.684434 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 17 18:26:10.684461 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.684437 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 17 18:26:10.694389 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.694365 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5"] Apr 17 18:26:10.696636 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.696610 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.696751 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.696702 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ebece8-5e9c-4bc6-a077-b96fa52986ec-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.696751 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.696737 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82ebece8-5e9c-4bc6-a077-b96fa52986ec-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.696877 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.696847 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnbp8\" (UniqueName: \"kubernetes.io/projected/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kube-api-access-xnbp8\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.797945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.797909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ebece8-5e9c-4bc6-a077-b96fa52986ec-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.797945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.797943 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82ebece8-5e9c-4bc6-a077-b96fa52986ec-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.798171 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.797991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnbp8\" (UniqueName: \"kubernetes.io/projected/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kube-api-access-xnbp8\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.798171 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.798012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.798508 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.798482 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.798778 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.798756 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82ebece8-5e9c-4bc6-a077-b96fa52986ec-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.800469 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.800445 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ebece8-5e9c-4bc6-a077-b96fa52986ec-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.806221 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.806200 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnbp8\" (UniqueName: \"kubernetes.io/projected/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kube-api-access-xnbp8\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:10.993983 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:10.993904 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:11.114061 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:11.114032 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5"] Apr 17 18:26:11.116636 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:26:11.116608 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ebece8_5e9c_4bc6_a077_b96fa52986ec.slice/crio-aefb9d41b5df5781e64b9fb52846762cbe4f117457a3c20eff2c3b6d804e46ab WatchSource:0}: Error finding container aefb9d41b5df5781e64b9fb52846762cbe4f117457a3c20eff2c3b6d804e46ab: Status 404 returned error can't find the container with id aefb9d41b5df5781e64b9fb52846762cbe4f117457a3c20eff2c3b6d804e46ab Apr 17 18:26:11.118954 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:11.118935 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:26:11.208048 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:11.208017 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" event={"ID":"82ebece8-5e9c-4bc6-a077-b96fa52986ec","Type":"ContainerStarted","Data":"99b4303dca874bad96839c0eccdef0a5222581d41a623c2ad79d164c1fbd6cec"} Apr 17 18:26:11.208048 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:11.208052 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" event={"ID":"82ebece8-5e9c-4bc6-a077-b96fa52986ec","Type":"ContainerStarted","Data":"aefb9d41b5df5781e64b9fb52846762cbe4f117457a3c20eff2c3b6d804e46ab"} Apr 17 18:26:13.955317 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:13.955294 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:26:14.026101 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026023 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b03256a3-ec2d-4b64-9922-7269c901288f-proxy-tls\") pod \"b03256a3-ec2d-4b64-9922-7269c901288f\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " Apr 17 18:26:14.026101 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026053 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46v2r\" (UniqueName: \"kubernetes.io/projected/b03256a3-ec2d-4b64-9922-7269c901288f-kube-api-access-46v2r\") pod \"b03256a3-ec2d-4b64-9922-7269c901288f\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " Apr 17 18:26:14.026101 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026083 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"b03256a3-ec2d-4b64-9922-7269c901288f\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " Apr 17 18:26:14.026392 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026109 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-cabundle-cert\") pod \"b03256a3-ec2d-4b64-9922-7269c901288f\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " Apr 17 18:26:14.026392 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026174 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03256a3-ec2d-4b64-9922-7269c901288f-kserve-provision-location\") pod \"b03256a3-ec2d-4b64-9922-7269c901288f\" (UID: \"b03256a3-ec2d-4b64-9922-7269c901288f\") " Apr 17 18:26:14.026534 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026506 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "b03256a3-ec2d-4b64-9922-7269c901288f" (UID: "b03256a3-ec2d-4b64-9922-7269c901288f"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:26:14.026617 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026539 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "b03256a3-ec2d-4b64-9922-7269c901288f" (UID: "b03256a3-ec2d-4b64-9922-7269c901288f"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:26:14.026617 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.026555 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03256a3-ec2d-4b64-9922-7269c901288f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b03256a3-ec2d-4b64-9922-7269c901288f" (UID: "b03256a3-ec2d-4b64-9922-7269c901288f"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:26:14.028189 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.028170 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03256a3-ec2d-4b64-9922-7269c901288f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b03256a3-ec2d-4b64-9922-7269c901288f" (UID: "b03256a3-ec2d-4b64-9922-7269c901288f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:26:14.028276 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.028236 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03256a3-ec2d-4b64-9922-7269c901288f-kube-api-access-46v2r" (OuterVolumeSpecName: "kube-api-access-46v2r") pod "b03256a3-ec2d-4b64-9922-7269c901288f" (UID: "b03256a3-ec2d-4b64-9922-7269c901288f"). InnerVolumeSpecName "kube-api-access-46v2r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:26:14.126901 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.126864 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03256a3-ec2d-4b64-9922-7269c901288f-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:14.126901 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.126894 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b03256a3-ec2d-4b64-9922-7269c901288f-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:14.126901 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.126904 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-46v2r\" (UniqueName: \"kubernetes.io/projected/b03256a3-ec2d-4b64-9922-7269c901288f-kube-api-access-46v2r\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:14.127204 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.126915 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:14.127204 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.126925 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b03256a3-ec2d-4b64-9922-7269c901288f-cabundle-cert\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:14.221498 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.221464 2566 generic.go:358] "Generic (PLEG): container finished" podID="b03256a3-ec2d-4b64-9922-7269c901288f" containerID="af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f" exitCode=0 Apr 17 18:26:14.221646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.221512 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerDied","Data":"af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f"} Apr 17 18:26:14.221646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.221535 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" event={"ID":"b03256a3-ec2d-4b64-9922-7269c901288f","Type":"ContainerDied","Data":"6c4a95f4ed2aec684ac5c1900dfe9e8dea84a772a9834b7be6c04a97979dd5a8"} Apr 17 18:26:14.221646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.221542 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp" Apr 17 18:26:14.221646 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.221550 2566 scope.go:117] "RemoveContainer" containerID="2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa" Apr 17 18:26:14.230139 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.230123 2566 scope.go:117] "RemoveContainer" containerID="af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f" Apr 17 18:26:14.236878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.236862 2566 scope.go:117] "RemoveContainer" containerID="8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9" Apr 17 18:26:14.243370 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.243352 2566 scope.go:117] "RemoveContainer" containerID="2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa" Apr 17 18:26:14.243705 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:26:14.243678 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa\": container with ID starting with 2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa not found: ID does not exist" containerID="2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa" Apr 17 18:26:14.243872 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.243715 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa"} err="failed to get container status \"2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa\": rpc error: code = NotFound desc = could not find container \"2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa\": container with ID starting with 2428054df9bcf24181b5e4e2133dd0f872ac1fc5fd5d4c9e0c20c68d9c7f81aa not found: ID does not exist" Apr 17 18:26:14.243872 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.243739 2566 scope.go:117] "RemoveContainer" containerID="af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f" Apr 17 18:26:14.244112 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:26:14.244084 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f\": container with ID starting with af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f not found: ID does not exist" containerID="af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f" Apr 17 18:26:14.244247 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.244121 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f"} err="failed to get container status \"af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f\": rpc error: code = NotFound desc = could not find container \"af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f\": container with ID 
starting with af58ba4ac831b5536efca51c51e3c72ca4fc335e5532fb8aa6d76bab1ed1218f not found: ID does not exist" Apr 17 18:26:14.244247 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.244142 2566 scope.go:117] "RemoveContainer" containerID="8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9" Apr 17 18:26:14.244506 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:26:14.244478 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9\": container with ID starting with 8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9 not found: ID does not exist" containerID="8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9" Apr 17 18:26:14.244588 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.244510 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9"} err="failed to get container status \"8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9\": rpc error: code = NotFound desc = could not find container \"8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9\": container with ID starting with 8f246bdee039f3f11a4dd235cc4e4d0293049d979f1803f3966b5a32e334d3b9 not found: ID does not exist" Apr 17 18:26:14.245982 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.245965 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp"] Apr 17 18:26:14.250615 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.250596 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-j2cxp"] Apr 17 18:26:14.837894 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:14.837865 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" path="/var/lib/kubelet/pods/b03256a3-ec2d-4b64-9922-7269c901288f/volumes" Apr 17 18:26:17.232758 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:17.232731 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5_82ebece8-5e9c-4bc6-a077-b96fa52986ec/storage-initializer/0.log" Apr 17 18:26:17.233127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:17.232767 2566 generic.go:358] "Generic (PLEG): container finished" podID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerID="99b4303dca874bad96839c0eccdef0a5222581d41a623c2ad79d164c1fbd6cec" exitCode=1 Apr 17 18:26:17.233127 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:17.232846 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" event={"ID":"82ebece8-5e9c-4bc6-a077-b96fa52986ec","Type":"ContainerDied","Data":"99b4303dca874bad96839c0eccdef0a5222581d41a623c2ad79d164c1fbd6cec"} Apr 17 18:26:18.238747 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:18.238719 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5_82ebece8-5e9c-4bc6-a077-b96fa52986ec/storage-initializer/0.log" Apr 17 18:26:18.239123 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:18.238834 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" event={"ID":"82ebece8-5e9c-4bc6-a077-b96fa52986ec","Type":"ContainerStarted","Data":"101d904853202d69564f12d7ffa8c950f70da97905a66b6aee4cafc7bca6634b"} Apr 17 18:26:20.689423 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:20.689389 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5"] Apr 17 18:26:20.689811 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:20.689717 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" containerID="cri-o://101d904853202d69564f12d7ffa8c950f70da97905a66b6aee4cafc7bca6634b" gracePeriod=30 Apr 17 18:26:21.761781 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.761722 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s"] Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762037 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762048 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762059 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="storage-initializer" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762067 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="storage-initializer" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762085 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kube-rbac-proxy" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762091 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kube-rbac-proxy" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762138 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kube-rbac-proxy" Apr 17 18:26:21.762169 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.762148 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b03256a3-ec2d-4b64-9922-7269c901288f" containerName="kserve-container" Apr 17 18:26:21.764995 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.764978 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.767065 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.767035 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 17 18:26:21.767163 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.767060 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 17 18:26:21.767163 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.767103 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 18:26:21.775656 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.775633 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s"] Apr 17 18:26:21.894440 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.894407 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvng\" (UniqueName: \"kubernetes.io/projected/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kube-api-access-ktvng\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.894638 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.894481 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.894638 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.894531 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.894638 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.894601 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.894765 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.894640 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995265 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995228 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995460 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995460 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995311 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995460 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995336 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995460 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvng\" (UniqueName: \"kubernetes.io/projected/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kube-api-access-ktvng\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995702 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:26:21.995472 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert" not found Apr 17 18:26:21.995702 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:26:21.995549 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls podName:bb65dac7-d83f-4063-bb76-0cf2a991cd3e nodeName:}" failed. No retries permitted until 2026-04-17 18:26:22.49552782 +0000 UTC m=+3694.165607444 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls") pod "isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" (UID: "bb65dac7-d83f-4063-bb76-0cf2a991cd3e") : secret "isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert" not found Apr 17 18:26:21.995702 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995667 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.995955 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.995933 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:21.996077 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:21.996058 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:22.004075 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.004047 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvng\" (UniqueName: \"kubernetes.io/projected/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kube-api-access-ktvng\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:22.254476 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.254451 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5_82ebece8-5e9c-4bc6-a077-b96fa52986ec/storage-initializer/1.log" Apr 17 18:26:22.254836 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.254817 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5_82ebece8-5e9c-4bc6-a077-b96fa52986ec/storage-initializer/0.log" Apr 17 18:26:22.254932 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.254856 2566 generic.go:358] "Generic (PLEG): container finished" podID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerID="101d904853202d69564f12d7ffa8c950f70da97905a66b6aee4cafc7bca6634b" exitCode=1 Apr 17 18:26:22.254932 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.254911 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" event={"ID":"82ebece8-5e9c-4bc6-a077-b96fa52986ec","Type":"ContainerDied","Data":"101d904853202d69564f12d7ffa8c950f70da97905a66b6aee4cafc7bca6634b"} Apr 17 
18:26:22.255043 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.254952 2566 scope.go:117] "RemoveContainer" containerID="99b4303dca874bad96839c0eccdef0a5222581d41a623c2ad79d164c1fbd6cec" Apr 17 18:26:22.317886 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.317866 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5_82ebece8-5e9c-4bc6-a077-b96fa52986ec/storage-initializer/1.log" Apr 17 18:26:22.318003 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.317926 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:22.399163 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.399130 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ebece8-5e9c-4bc6-a077-b96fa52986ec-proxy-tls\") pod \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " Apr 17 18:26:22.399339 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.399180 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82ebece8-5e9c-4bc6-a077-b96fa52986ec-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " Apr 17 18:26:22.399339 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.399209 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnbp8\" (UniqueName: \"kubernetes.io/projected/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kube-api-access-xnbp8\") pod \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " Apr 17 18:26:22.399339 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.399299 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kserve-provision-location\") pod \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\" (UID: \"82ebece8-5e9c-4bc6-a077-b96fa52986ec\") " Apr 17 18:26:22.399622 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.399600 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "82ebece8-5e9c-4bc6-a077-b96fa52986ec" (UID: "82ebece8-5e9c-4bc6-a077-b96fa52986ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:26:22.399681 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.399632 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ebece8-5e9c-4bc6-a077-b96fa52986ec-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "82ebece8-5e9c-4bc6-a077-b96fa52986ec" (UID: "82ebece8-5e9c-4bc6-a077-b96fa52986ec"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:26:22.401223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.401201 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ebece8-5e9c-4bc6-a077-b96fa52986ec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "82ebece8-5e9c-4bc6-a077-b96fa52986ec" (UID: "82ebece8-5e9c-4bc6-a077-b96fa52986ec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:26:22.401316 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.401228 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kube-api-access-xnbp8" (OuterVolumeSpecName: "kube-api-access-xnbp8") pod "82ebece8-5e9c-4bc6-a077-b96fa52986ec" (UID: "82ebece8-5e9c-4bc6-a077-b96fa52986ec"). InnerVolumeSpecName "kube-api-access-xnbp8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:26:22.500591 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.500507 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:22.500591 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.500566 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:22.500591 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.500579 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ebece8-5e9c-4bc6-a077-b96fa52986ec-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:22.500591 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.500590 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82ebece8-5e9c-4bc6-a077-b96fa52986ec-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:22.500837 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.500601 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnbp8\" (UniqueName: \"kubernetes.io/projected/82ebece8-5e9c-4bc6-a077-b96fa52986ec-kube-api-access-xnbp8\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:26:22.502854 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.502825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:22.674274 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.674215 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:22.795379 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:22.795326 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s"] Apr 17 18:26:22.799023 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:26:22.798990 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb65dac7_d83f_4063_bb76_0cf2a991cd3e.slice/crio-ad831386cc57ffebb49cd67d36ebd7a713d9d87af0c813fc5d5bb7c3538fedc0 WatchSource:0}: Error finding container ad831386cc57ffebb49cd67d36ebd7a713d9d87af0c813fc5d5bb7c3538fedc0: Status 404 returned error can't find the container with id ad831386cc57ffebb49cd67d36ebd7a713d9d87af0c813fc5d5bb7c3538fedc0 Apr 17 18:26:23.259245 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.259211 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5_82ebece8-5e9c-4bc6-a077-b96fa52986ec/storage-initializer/1.log" Apr 17 18:26:23.259449 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.259380 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" event={"ID":"82ebece8-5e9c-4bc6-a077-b96fa52986ec","Type":"ContainerDied","Data":"aefb9d41b5df5781e64b9fb52846762cbe4f117457a3c20eff2c3b6d804e46ab"} Apr 17 18:26:23.259449 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.259406 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5" Apr 17 18:26:23.259449 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.259419 2566 scope.go:117] "RemoveContainer" containerID="101d904853202d69564f12d7ffa8c950f70da97905a66b6aee4cafc7bca6634b" Apr 17 18:26:23.261081 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.261051 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerStarted","Data":"225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378"} Apr 17 18:26:23.261190 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.261091 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerStarted","Data":"ad831386cc57ffebb49cd67d36ebd7a713d9d87af0c813fc5d5bb7c3538fedc0"} Apr 17 18:26:23.313888 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.313830 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5"] Apr 17 18:26:23.315997 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:23.315973 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-9g7n5"] Apr 17 18:26:24.265215 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:24.265178 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerID="225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378" exitCode=0 Apr 17 18:26:24.265640 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:24.265270 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerDied","Data":"225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378"} Apr 17 18:26:24.838272 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:24.838222 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" path="/var/lib/kubelet/pods/82ebece8-5e9c-4bc6-a077-b96fa52986ec/volumes" Apr 17 18:26:25.271004 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:25.270918 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerStarted","Data":"4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73"} Apr 17 18:26:25.271004 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:25.270957 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerStarted","Data":"468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905"} Apr 17 18:26:25.271469 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:25.271102 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:25.292027 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:25.291976 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podStartSLOduration=4.291961533 podStartE2EDuration="4.291961533s" podCreationTimestamp="2026-04-17 18:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:26:25.289565813 +0000 UTC m=+3696.959645455" watchObservedRunningTime="2026-04-17 18:26:25.291961533 +0000 UTC m=+3696.962041175" Apr 17 18:26:26.274659 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:26.274626 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:26.275807 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:26.275783 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:26:27.277857 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:27.277815 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:26:32.283351 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:32.283324 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:26:32.283916 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:32.283889 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:26:42.284161 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:42.284118 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:26:52.284223 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:26:52.284183 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:27:02.284374 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:02.284332 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:27:12.284437 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:12.284397 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:27:22.283920 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:22.283876 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:27:32.284405 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:32.284371 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:27:41.787631 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:41.787596 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s"] Apr 17 18:27:41.788021 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:41.787928 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" containerID="cri-o://468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905" gracePeriod=30 Apr 17 18:27:41.788021 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:41.787978 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kube-rbac-proxy" containerID="cri-o://4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73" gracePeriod=30 Apr 17 18:27:42.278399 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:27:42.278352 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.63:8643/healthz\": dial tcp 10.133.0.63:8643: connect: connection refused" Apr 17 18:27:42.284078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.284055 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 17 18:27:42.492005 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.491975 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerID="4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73" exitCode=2 Apr 17 18:27:42.492200 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.492014 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerDied","Data":"4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73"} Apr 17 18:27:42.852781 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.852752 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w"] Apr 17 18:27:42.853151 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.853065 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" Apr 17 18:27:42.853151 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.853076 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" Apr 17 18:27:42.853151 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.853125 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" Apr 17 18:27:42.853151 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.853139 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" Apr 17 18:27:42.853337 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.853216 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" Apr 17 18:27:42.853337 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.853228 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ebece8-5e9c-4bc6-a077-b96fa52986ec" containerName="storage-initializer" Apr 17 18:27:42.856131 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.856112 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.858465 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.858450 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 17 18:27:42.858526 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.858458 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 17 18:27:42.866339 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.866316 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w"] Apr 17 18:27:42.880511 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.880487 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09a34602-dd68-43d3-8b1d-51ae628c1b9a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.880624 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.880523 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8287q\" (UniqueName: \"kubernetes.io/projected/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kube-api-access-8287q\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.880624 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.880546 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a34602-dd68-43d3-8b1d-51ae628c1b9a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.880624 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.880591 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.981436 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.981403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09a34602-dd68-43d3-8b1d-51ae628c1b9a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.981620 ip-10-0-140-147 
kubenswrapper[2566]: I0417 18:27:42.981448 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8287q\" (UniqueName: \"kubernetes.io/projected/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kube-api-access-8287q\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.981620 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.981471 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a34602-dd68-43d3-8b1d-51ae628c1b9a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.981620 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.981487 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.981869 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.981851 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.982096 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.982071 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09a34602-dd68-43d3-8b1d-51ae628c1b9a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.984008 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.983986 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a34602-dd68-43d3-8b1d-51ae628c1b9a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:42.989366 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:42.989345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8287q\" (UniqueName: \"kubernetes.io/projected/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kube-api-access-8287q\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:43.165703 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:43.165626 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:43.283750 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:43.283725 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w"] Apr 17 18:27:43.285651 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:27:43.285615 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a34602_dd68_43d3_8b1d_51ae628c1b9a.slice/crio-31b9928e10f74f41d15ffe492b918846e8bcb5536fffd3f189e515a88f3197dc WatchSource:0}: Error finding container 31b9928e10f74f41d15ffe492b918846e8bcb5536fffd3f189e515a88f3197dc: Status 404 returned error can't find the container with id 31b9928e10f74f41d15ffe492b918846e8bcb5536fffd3f189e515a88f3197dc Apr 17 18:27:43.502300 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:43.502212 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" event={"ID":"09a34602-dd68-43d3-8b1d-51ae628c1b9a","Type":"ContainerStarted","Data":"44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a"} Apr 17 18:27:43.502300 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:43.502264 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" event={"ID":"09a34602-dd68-43d3-8b1d-51ae628c1b9a","Type":"ContainerStarted","Data":"31b9928e10f74f41d15ffe492b918846e8bcb5536fffd3f189e515a88f3197dc"} Apr 17 18:27:46.029736 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.029712 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:27:46.102452 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102374 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kserve-provision-location\") pod \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " Apr 17 18:27:46.102452 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102418 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-cabundle-cert\") pod \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " Apr 17 18:27:46.102452 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102446 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " Apr 17 18:27:46.102709 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102500 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls\") pod \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " Apr 17 18:27:46.102709 ip-10-0-140-147 kubenswrapper[2566]: I0417 
18:27:46.102541 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktvng\" (UniqueName: \"kubernetes.io/projected/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kube-api-access-ktvng\") pod \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\" (UID: \"bb65dac7-d83f-4063-bb76-0cf2a991cd3e\") " Apr 17 18:27:46.102828 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102799 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bb65dac7-d83f-4063-bb76-0cf2a991cd3e" (UID: "bb65dac7-d83f-4063-bb76-0cf2a991cd3e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:27:46.102828 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102814 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "bb65dac7-d83f-4063-bb76-0cf2a991cd3e" (UID: "bb65dac7-d83f-4063-bb76-0cf2a991cd3e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:27:46.102977 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.102872 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "bb65dac7-d83f-4063-bb76-0cf2a991cd3e" (UID: "bb65dac7-d83f-4063-bb76-0cf2a991cd3e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:27:46.104614 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.104588 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bb65dac7-d83f-4063-bb76-0cf2a991cd3e" (UID: "bb65dac7-d83f-4063-bb76-0cf2a991cd3e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:27:46.104690 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.104612 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kube-api-access-ktvng" (OuterVolumeSpecName: "kube-api-access-ktvng") pod "bb65dac7-d83f-4063-bb76-0cf2a991cd3e" (UID: "bb65dac7-d83f-4063-bb76-0cf2a991cd3e"). InnerVolumeSpecName "kube-api-access-ktvng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:27:46.203945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.203902 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ktvng\" (UniqueName: \"kubernetes.io/projected/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kube-api-access-ktvng\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:46.203945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.203939 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:46.203945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.203951 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-cabundle-cert\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:46.203945 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.203962 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:46.204229 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.203972 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb65dac7-d83f-4063-bb76-0cf2a991cd3e-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:46.511768 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.511686 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerID="468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905" exitCode=0 Apr 17 18:27:46.511768 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.511733 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerDied","Data":"468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905"} Apr 17 18:27:46.511768 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.511767 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" Apr 17 18:27:46.511991 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.511772 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s" event={"ID":"bb65dac7-d83f-4063-bb76-0cf2a991cd3e","Type":"ContainerDied","Data":"ad831386cc57ffebb49cd67d36ebd7a713d9d87af0c813fc5d5bb7c3538fedc0"} Apr 17 18:27:46.511991 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.511788 2566 scope.go:117] "RemoveContainer" containerID="4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73" Apr 17 18:27:46.519780 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.519764 2566 scope.go:117] "RemoveContainer" containerID="468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905" Apr 17 18:27:46.526629 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.526612 2566 scope.go:117] "RemoveContainer" containerID="225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378" Apr 17 18:27:46.533077 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.533060 2566 scope.go:117] "RemoveContainer" containerID="4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73" Apr 17 18:27:46.533352 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:27:46.533330 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73\": container with ID starting with 4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73 not found: ID does not exist" containerID="4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73" Apr 17 18:27:46.533464 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.533362 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73"} err="failed to get container status \"4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73\": rpc error: code = NotFound desc = could not find container \"4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73\": container with ID starting with 4bf3a7ba30f9992e1ea743e848b99c2625c84d1fade4f7c7fbe21fdc3e3d2b73 not found: ID does not exist" Apr 17 18:27:46.533464 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.533385 2566 scope.go:117] "RemoveContainer" containerID="468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905" Apr 17 18:27:46.533683 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:27:46.533662 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905\": container with ID starting with 468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905 not found: ID does not exist" containerID="468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905" Apr 17 18:27:46.533724 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.533694 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905"} err="failed to get container status \"468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905\": rpc error: code = NotFound desc = could not find container \"468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905\": container with ID 
starting with 468140bb1eaf8d8110f235de3bbf4cc4a69bff0b1dfd4b4f4e12e4e700ac1905 not found: ID does not exist" Apr 17 18:27:46.533724 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.533717 2566 scope.go:117] "RemoveContainer" containerID="225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378" Apr 17 18:27:46.533989 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:27:46.533970 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378\": container with ID starting with 225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378 not found: ID does not exist" containerID="225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378" Apr 17 18:27:46.534033 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.533994 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378"} err="failed to get container status \"225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378\": rpc error: code = NotFound desc = could not find container \"225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378\": container with ID starting with 225dd66cfee899f8b8c8855cd1ca6a6cd8fed9f6929ce67a69098d0c5ac9e378 not found: ID does not exist" Apr 17 18:27:46.534683 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.534667 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s"] Apr 17 18:27:46.538408 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.538389 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-v544s"] Apr 17 18:27:46.838499 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:46.838471 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" path="/var/lib/kubelet/pods/bb65dac7-d83f-4063-bb76-0cf2a991cd3e/volumes" Apr 17 18:27:49.521679 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:49.521602 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w_09a34602-dd68-43d3-8b1d-51ae628c1b9a/storage-initializer/0.log" Apr 17 18:27:49.521679 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:49.521643 2566 generic.go:358] "Generic (PLEG): container finished" podID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerID="44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a" exitCode=1 Apr 17 18:27:49.522102 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:49.521720 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" event={"ID":"09a34602-dd68-43d3-8b1d-51ae628c1b9a","Type":"ContainerDied","Data":"44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a"} Apr 17 18:27:50.526081 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:50.526054 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w_09a34602-dd68-43d3-8b1d-51ae628c1b9a/storage-initializer/0.log" Apr 17 18:27:50.526560 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:50.526122 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" event={"ID":"09a34602-dd68-43d3-8b1d-51ae628c1b9a","Type":"ContainerStarted","Data":"5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50"} Apr 17 18:27:52.842965 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:52.842933 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w"] Apr 17 18:27:52.843354 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:52.843225 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" containerID="cri-o://5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50" gracePeriod=30 Apr 17 18:27:54.078389 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.078367 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w_09a34602-dd68-43d3-8b1d-51ae628c1b9a/storage-initializer/1.log" Apr 17 18:27:54.078741 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.078725 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w_09a34602-dd68-43d3-8b1d-51ae628c1b9a/storage-initializer/0.log" Apr 17 18:27:54.078801 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.078791 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:54.164899 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.164873 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09a34602-dd68-43d3-8b1d-51ae628c1b9a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " Apr 17 18:27:54.165051 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.164933 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a34602-dd68-43d3-8b1d-51ae628c1b9a-proxy-tls\") pod \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " Apr 17 18:27:54.165051 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.164962 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kserve-provision-location\") pod \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " Apr 17 18:27:54.165051 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.165011 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8287q\" (UniqueName: \"kubernetes.io/projected/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kube-api-access-8287q\") pod \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\" (UID: \"09a34602-dd68-43d3-8b1d-51ae628c1b9a\") " Apr 17 18:27:54.165236 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.165210 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "09a34602-dd68-43d3-8b1d-51ae628c1b9a" (UID: "09a34602-dd68-43d3-8b1d-51ae628c1b9a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:27:54.165318 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.165236 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a34602-dd68-43d3-8b1d-51ae628c1b9a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "09a34602-dd68-43d3-8b1d-51ae628c1b9a" (UID: "09a34602-dd68-43d3-8b1d-51ae628c1b9a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:27:54.167033 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.167009 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kube-api-access-8287q" (OuterVolumeSpecName: "kube-api-access-8287q") pod "09a34602-dd68-43d3-8b1d-51ae628c1b9a" (UID: "09a34602-dd68-43d3-8b1d-51ae628c1b9a"). InnerVolumeSpecName "kube-api-access-8287q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:27:54.167033 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.167015 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a34602-dd68-43d3-8b1d-51ae628c1b9a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "09a34602-dd68-43d3-8b1d-51ae628c1b9a" (UID: "09a34602-dd68-43d3-8b1d-51ae628c1b9a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:27:54.265873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.265808 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8287q\" (UniqueName: \"kubernetes.io/projected/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kube-api-access-8287q\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:54.265873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.265836 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09a34602-dd68-43d3-8b1d-51ae628c1b9a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:54.265873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.265848 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a34602-dd68-43d3-8b1d-51ae628c1b9a-proxy-tls\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:54.265873 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.265857 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a34602-dd68-43d3-8b1d-51ae628c1b9a-kserve-provision-location\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:27:54.540837 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.540758 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w_09a34602-dd68-43d3-8b1d-51ae628c1b9a/storage-initializer/1.log" Apr 17 18:27:54.541121 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.541104 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w_09a34602-dd68-43d3-8b1d-51ae628c1b9a/storage-initializer/0.log" Apr 17 18:27:54.541178 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.541140 2566 generic.go:358] "Generic (PLEG): container finished" podID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerID="5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50" exitCode=1 Apr 17 18:27:54.541216 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.541171 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" event={"ID":"09a34602-dd68-43d3-8b1d-51ae628c1b9a","Type":"ContainerDied","Data":"5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50"} Apr 17 18:27:54.541216 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.541209 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" event={"ID":"09a34602-dd68-43d3-8b1d-51ae628c1b9a","Type":"ContainerDied","Data":"31b9928e10f74f41d15ffe492b918846e8bcb5536fffd3f189e515a88f3197dc"} Apr 17 18:27:54.541306 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.541212 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w" Apr 17 18:27:54.541306 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.541225 2566 scope.go:117] "RemoveContainer" containerID="5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50" Apr 17 18:27:54.549359 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.549272 2566 scope.go:117] "RemoveContainer" containerID="44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a" Apr 17 18:27:54.555856 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.555837 2566 scope.go:117] "RemoveContainer" containerID="5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50" Apr 17 18:27:54.556078 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:27:54.556058 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50\": container with ID starting with 5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50 not found: ID does not exist" containerID="5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50" Apr 17 18:27:54.556125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.556086 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50"} err="failed to get container status \"5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50\": rpc error: code = NotFound desc = could not find container \"5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50\": container with ID starting with 5920ade69f5b340f20fd878e1b40f77014e5e6f62beacf43e1889b9735c27a50 not found: ID does not exist" Apr 17 18:27:54.556125 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.556104 2566 scope.go:117] "RemoveContainer" containerID="44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a" Apr 17 18:27:54.556293 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:27:54.556276 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a\": container with ID starting with 44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a not found: ID does not exist" containerID="44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a" Apr 17 18:27:54.556339 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.556298 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a"} err="failed to get container status \"44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a\": rpc error: code = NotFound desc = could not find container \"44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a\": container with ID starting with 44d397f47166576bf4e919f8ebafdf73da637f4dbc9e719b34ce746a4e55728a not found: ID does not exist" Apr 17 18:27:54.578955 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.578931 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w"] Apr 17 18:27:54.582985 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.582967 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-p2f8w"] Apr 17 
18:27:54.841863 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:54.841831 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" path="/var/lib/kubelet/pods/09a34602-dd68-43d3-8b1d-51ae628c1b9a/volumes" Apr 17 18:27:55.183894 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.183813 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h8nsf/must-gather-7jfbw"] Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184089 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184099 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184111 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kube-rbac-proxy" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184117 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kube-rbac-proxy" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184122 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184128 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184140 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184145 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184152 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184158 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184201 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kube-rbac-proxy" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184208 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" Apr 17 18:27:55.184244 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184218 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb65dac7-d83f-4063-bb76-0cf2a991cd3e" containerName="kserve-container" Apr 17 18:27:55.184891 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.184310 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="09a34602-dd68-43d3-8b1d-51ae628c1b9a" containerName="storage-initializer" Apr 17 18:27:55.188496 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.188477 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.190784 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.190759 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8nsf\"/\"kube-root-ca.crt\"" Apr 17 18:27:55.190895 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.190768 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8nsf\"/\"openshift-service-ca.crt\"" Apr 17 18:27:55.196771 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.196743 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8nsf/must-gather-7jfbw"] Apr 17 18:27:55.272178 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.272148 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e3e34830-b33d-40b3-b1b2-47916f8017fd-must-gather-output\") pod \"must-gather-7jfbw\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.272333 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.272185 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtnfz\" (UniqueName: \"kubernetes.io/projected/e3e34830-b33d-40b3-b1b2-47916f8017fd-kube-api-access-qtnfz\") pod \"must-gather-7jfbw\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.373520 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.373475 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e3e34830-b33d-40b3-b1b2-47916f8017fd-must-gather-output\") pod \"must-gather-7jfbw\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.373520 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.373529 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtnfz\" (UniqueName: \"kubernetes.io/projected/e3e34830-b33d-40b3-b1b2-47916f8017fd-kube-api-access-qtnfz\") pod \"must-gather-7jfbw\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.373802 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.373783 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e3e34830-b33d-40b3-b1b2-47916f8017fd-must-gather-output\") pod \"must-gather-7jfbw\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.381504 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.381479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtnfz\" (UniqueName: \"kubernetes.io/projected/e3e34830-b33d-40b3-b1b2-47916f8017fd-kube-api-access-qtnfz\") pod \"must-gather-7jfbw\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.512682 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.512602 2566 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:27:55.626660 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:55.626638 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8nsf/must-gather-7jfbw"] Apr 17 18:27:55.628867 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:27:55.628836 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3e34830_b33d_40b3_b1b2_47916f8017fd.slice/crio-1b74b8b5c7aa8813641e8dd11f167f771ccf0accf88330c31fcaff4d1848129b WatchSource:0}: Error finding container 1b74b8b5c7aa8813641e8dd11f167f771ccf0accf88330c31fcaff4d1848129b: Status 404 returned error can't find the container with id 1b74b8b5c7aa8813641e8dd11f167f771ccf0accf88330c31fcaff4d1848129b Apr 17 18:27:56.550330 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:27:56.550278 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" event={"ID":"e3e34830-b33d-40b3-b1b2-47916f8017fd","Type":"ContainerStarted","Data":"1b74b8b5c7aa8813641e8dd11f167f771ccf0accf88330c31fcaff4d1848129b"} Apr 17 18:28:00.563718 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:00.563679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" event={"ID":"e3e34830-b33d-40b3-b1b2-47916f8017fd","Type":"ContainerStarted","Data":"dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb"} Apr 17 18:28:00.563718 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:00.563723 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" event={"ID":"e3e34830-b33d-40b3-b1b2-47916f8017fd","Type":"ContainerStarted","Data":"ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9"} Apr 17 18:28:00.580687 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:00.580645 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" podStartSLOduration=1.066152271 podStartE2EDuration="5.58063051s" podCreationTimestamp="2026-04-17 18:27:55 +0000 UTC" firstStartedPulling="2026-04-17 18:27:55.630304901 +0000 UTC m=+3787.300384520" lastFinishedPulling="2026-04-17 18:28:00.144783126 +0000 UTC m=+3791.814862759" observedRunningTime="2026-04-17 18:28:00.579060843 +0000 UTC m=+3792.249140487" watchObservedRunningTime="2026-04-17 18:28:00.58063051 +0000 UTC m=+3792.250710151" Apr 17 18:28:21.627326 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:21.627293 2566 generic.go:358] "Generic (PLEG): container finished" podID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerID="ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9" exitCode=0 Apr 17 18:28:21.627748 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:21.627366 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" event={"ID":"e3e34830-b33d-40b3-b1b2-47916f8017fd","Type":"ContainerDied","Data":"ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9"} Apr 17 18:28:21.627748 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:21.627690 2566 scope.go:117] "RemoveContainer" containerID="ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9" Apr 17 18:28:22.306050 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.306022 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-h8nsf_must-gather-7jfbw_e3e34830-b33d-40b3-b1b2-47916f8017fd/gather/0.log" Apr 17 18:28:22.956934 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.956902 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2d5hj/must-gather-l275n"] Apr 17 18:28:22.960116 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.960100 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:22.962351 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.962319 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2d5hj\"/\"default-dockercfg-gs5n6\"" Apr 17 18:28:22.962483 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.962401 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2d5hj\"/\"openshift-service-ca.crt\"" Apr 17 18:28:22.962483 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.962427 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2d5hj\"/\"kube-root-ca.crt\"" Apr 17 18:28:22.969123 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:22.969105 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/must-gather-l275n"] Apr 17 18:28:23.111504 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.111471 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsn84\" (UniqueName: \"kubernetes.io/projected/ef8af271-626d-4227-9601-87c2e4185164-kube-api-access-jsn84\") pod \"must-gather-l275n\" (UID: \"ef8af271-626d-4227-9601-87c2e4185164\") " pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.111679 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.111525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef8af271-626d-4227-9601-87c2e4185164-must-gather-output\") pod \"must-gather-l275n\" (UID: \"ef8af271-626d-4227-9601-87c2e4185164\") " pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.212081 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.212008 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsn84\" (UniqueName: \"kubernetes.io/projected/ef8af271-626d-4227-9601-87c2e4185164-kube-api-access-jsn84\") pod \"must-gather-l275n\" (UID: \"ef8af271-626d-4227-9601-87c2e4185164\") " pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.212081 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.212050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef8af271-626d-4227-9601-87c2e4185164-must-gather-output\") pod \"must-gather-l275n\" (UID: \"ef8af271-626d-4227-9601-87c2e4185164\") " pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.212362 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.212347 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef8af271-626d-4227-9601-87c2e4185164-must-gather-output\") pod \"must-gather-l275n\" (UID: \"ef8af271-626d-4227-9601-87c2e4185164\") " pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.220767 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.220734 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsn84\" (UniqueName: \"kubernetes.io/projected/ef8af271-626d-4227-9601-87c2e4185164-kube-api-access-jsn84\") pod \"must-gather-l275n\" (UID: \"ef8af271-626d-4227-9601-87c2e4185164\") " pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.269697 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.269670 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2d5hj/must-gather-l275n" Apr 17 18:28:23.388915 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.388878 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/must-gather-l275n"] Apr 17 18:28:23.391605 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:28:23.391574 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef8af271_626d_4227_9601_87c2e4185164.slice/crio-db89f8612ac036aa25b35312457d0f5daefbdbe9c1a7665a08ca5191fa47f1a4 WatchSource:0}: Error finding container db89f8612ac036aa25b35312457d0f5daefbdbe9c1a7665a08ca5191fa47f1a4: Status 404 returned error can't find the container with id db89f8612ac036aa25b35312457d0f5daefbdbe9c1a7665a08ca5191fa47f1a4 Apr 17 18:28:23.636115 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:23.636080 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/must-gather-l275n" event={"ID":"ef8af271-626d-4227-9601-87c2e4185164","Type":"ContainerStarted","Data":"db89f8612ac036aa25b35312457d0f5daefbdbe9c1a7665a08ca5191fa47f1a4"} Apr 17 18:28:24.641372 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:24.641336 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/must-gather-l275n" event={"ID":"ef8af271-626d-4227-9601-87c2e4185164","Type":"ContainerStarted","Data":"a136b8b1ad00ef4b2987f451609fbbdcbab73496a0fbdc1889ae6267e751faa7"} Apr 17 18:28:24.641372 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:24.641376 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/must-gather-l275n" event={"ID":"ef8af271-626d-4227-9601-87c2e4185164","Type":"ContainerStarted","Data":"824069d31e1b76b365c91eeea2942f0eb1794378c9a637d66c92b86eb89c3d0a"} Apr 17 18:28:25.823601 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:25.823557 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q4gqn_b2a4b11e-5add-4df7-8e69-5b3342e010fe/global-pull-secret-syncer/0.log" Apr 17 18:28:25.967353 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:25.967323 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q9m54_348f9f1d-49f3-4771-9e69-3b4ba90f29e9/konnectivity-agent/0.log" Apr 17 18:28:26.042200 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:26.042169 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-147.ec2.internal_cf7a4b638e7a30f012eeb3bf5b7f1ece/haproxy/0.log" Apr 17 18:28:27.802345 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:27.802280 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2d5hj/must-gather-l275n" podStartSLOduration=4.992279802 podStartE2EDuration="5.802239103s" podCreationTimestamp="2026-04-17 18:28:22 +0000 UTC" firstStartedPulling="2026-04-17 18:28:23.393325165 +0000 UTC m=+3815.063404784" lastFinishedPulling="2026-04-17 18:28:24.203284462 +0000 UTC 
m=+3815.873364085" observedRunningTime="2026-04-17 18:28:24.658664087 +0000 UTC m=+3816.328743730" watchObservedRunningTime="2026-04-17 18:28:27.802239103 +0000 UTC m=+3819.472318745" Apr 17 18:28:27.803630 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:27.803598 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h8nsf/must-gather-7jfbw"] Apr 17 18:28:27.803871 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:27.803846 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="copy" containerID="cri-o://dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb" gracePeriod=2 Apr 17 18:28:27.808326 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:27.808303 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h8nsf/must-gather-7jfbw"] Apr 17 18:28:27.808758 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:27.808733 2566 status_manager.go:895] "Failed to get status for pod" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" err="pods \"must-gather-7jfbw\" is forbidden: User \"system:node:ip-10-0-140-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h8nsf\": no relationship found between node 'ip-10-0-140-147.ec2.internal' and this object" Apr 17 18:28:28.212272 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.211203 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h8nsf_must-gather-7jfbw_e3e34830-b33d-40b3-b1b2-47916f8017fd/copy/0.log" Apr 17 18:28:28.212272 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.211622 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:28:28.214500 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.214463 2566 status_manager.go:895] "Failed to get status for pod" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" err="pods \"must-gather-7jfbw\" is forbidden: User \"system:node:ip-10-0-140-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h8nsf\": no relationship found between node 'ip-10-0-140-147.ec2.internal' and this object" Apr 17 18:28:28.368336 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.368293 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e3e34830-b33d-40b3-b1b2-47916f8017fd-must-gather-output\") pod \"e3e34830-b33d-40b3-b1b2-47916f8017fd\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " Apr 17 18:28:28.368543 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.368410 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtnfz\" (UniqueName: \"kubernetes.io/projected/e3e34830-b33d-40b3-b1b2-47916f8017fd-kube-api-access-qtnfz\") pod \"e3e34830-b33d-40b3-b1b2-47916f8017fd\" (UID: \"e3e34830-b33d-40b3-b1b2-47916f8017fd\") " Apr 17 18:28:28.370478 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.370438 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e34830-b33d-40b3-b1b2-47916f8017fd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e3e34830-b33d-40b3-b1b2-47916f8017fd" (UID: "e3e34830-b33d-40b3-b1b2-47916f8017fd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:28:28.371595 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.371558 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e34830-b33d-40b3-b1b2-47916f8017fd-kube-api-access-qtnfz" (OuterVolumeSpecName: "kube-api-access-qtnfz") pod "e3e34830-b33d-40b3-b1b2-47916f8017fd" (UID: "e3e34830-b33d-40b3-b1b2-47916f8017fd"). InnerVolumeSpecName "kube-api-access-qtnfz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:28:28.469872 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.469761 2566 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e3e34830-b33d-40b3-b1b2-47916f8017fd-must-gather-output\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:28:28.469872 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.469804 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtnfz\" (UniqueName: \"kubernetes.io/projected/e3e34830-b33d-40b3-b1b2-47916f8017fd-kube-api-access-qtnfz\") on node \"ip-10-0-140-147.ec2.internal\" DevicePath \"\"" Apr 17 18:28:28.661515 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.661481 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h8nsf_must-gather-7jfbw_e3e34830-b33d-40b3-b1b2-47916f8017fd/copy/0.log" Apr 17 18:28:28.661909 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.661882 2566 generic.go:358] "Generic (PLEG): container finished" podID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerID="dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb" exitCode=143 Apr 17 18:28:28.661979 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.661944 2566 scope.go:117] "RemoveContainer" containerID="dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb" Apr 17 18:28:28.662084 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.662069 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" Apr 17 18:28:28.665470 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.665433 2566 status_manager.go:895] "Failed to get status for pod" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" err="pods \"must-gather-7jfbw\" is forbidden: User \"system:node:ip-10-0-140-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h8nsf\": no relationship found between node 'ip-10-0-140-147.ec2.internal' and this object" Apr 17 18:28:28.679777 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.679756 2566 scope.go:117] "RemoveContainer" containerID="ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9" Apr 17 18:28:28.708479 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.708324 2566 status_manager.go:895] "Failed to get status for pod" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" err="pods \"must-gather-7jfbw\" is forbidden: User \"system:node:ip-10-0-140-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h8nsf\": no relationship found between node 'ip-10-0-140-147.ec2.internal' and this object" Apr 17 18:28:28.717013 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.716987 2566 scope.go:117] "RemoveContainer" containerID="dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb" Apr 17 18:28:28.719982 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:28:28.719901 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb\": container with ID starting with dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb not found: ID does not exist" containerID="dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb" Apr 17 
18:28:28.720103 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.719976 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb"} err="failed to get container status \"dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb\": rpc error: code = NotFound desc = could not find container \"dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb\": container with ID starting with dea4e9e45a1ebc67cee0b450c1b0c15e4711963c1f5f0a479cec6ea593d01fdb not found: ID does not exist" Apr 17 18:28:28.720103 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.720002 2566 scope.go:117] "RemoveContainer" containerID="ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9" Apr 17 18:28:28.720661 ip-10-0-140-147 kubenswrapper[2566]: E0417 18:28:28.720325 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9\": container with ID starting with ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9 not found: ID does not exist" containerID="ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9" Apr 17 18:28:28.720661 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.720362 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9"} err="failed to get container status \"ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9\": rpc error: code = NotFound desc = could not find container \"ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9\": container with ID starting with ed7d58ffd7767e00609e704b7fcac153a40ae1360388b564cab27887a9f57ce9 not found: ID does not exist" Apr 17 18:28:28.839589 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.839537 2566 status_manager.go:895] "Failed to get status for pod" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" pod="openshift-must-gather-h8nsf/must-gather-7jfbw" err="pods \"must-gather-7jfbw\" is forbidden: User \"system:node:ip-10-0-140-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h8nsf\": no relationship found between node 'ip-10-0-140-147.ec2.internal' and this object" Apr 17 18:28:28.845266 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:28.845212 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" path="/var/lib/kubelet/pods/e3e34830-b33d-40b3-b1b2-47916f8017fd/volumes" Apr 17 18:28:29.307117 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.307079 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/alertmanager/0.log" Apr 17 18:28:29.340832 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.340781 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/config-reloader/0.log" Apr 17 18:28:29.371167 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.371120 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/kube-rbac-proxy-web/0.log" Apr 17 18:28:29.401138 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.401109 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/kube-rbac-proxy/0.log" Apr 17 18:28:29.424791 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.424757 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/kube-rbac-proxy-metric/0.log" Apr 17 18:28:29.453433 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.453399 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/prom-label-proxy/0.log" Apr 17 18:28:29.481342 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.481304 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8bebf9bf-4deb-4828-8e6f-cf7ab5513932/init-config-reloader/0.log" Apr 17 18:28:29.572813 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.572781 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2ss72_24cc7964-d06f-4bc3-a617-a62c30500317/kube-state-metrics/0.log" Apr 17 18:28:29.598957 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.598930 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2ss72_24cc7964-d06f-4bc3-a617-a62c30500317/kube-rbac-proxy-main/0.log" Apr 17 18:28:29.628566 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.628533 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2ss72_24cc7964-d06f-4bc3-a617-a62c30500317/kube-rbac-proxy-self/0.log" Apr 17 18:28:29.892956 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.892913 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kfhmc_8c08d72f-60f0-4355-8f32-3f4cb23cf290/node-exporter/0.log" Apr 17 18:28:29.917303 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.917248 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kfhmc_8c08d72f-60f0-4355-8f32-3f4cb23cf290/kube-rbac-proxy/0.log" Apr 17 18:28:29.942723 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.942695 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kfhmc_8c08d72f-60f0-4355-8f32-3f4cb23cf290/init-textfile/0.log" Apr 17 18:28:29.974346 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.974316 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8882s_83bdc5ec-af0a-4592-ab21-2e80793d42c9/kube-rbac-proxy-main/0.log" Apr 17 18:28:29.998755 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:29.998722 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8882s_83bdc5ec-af0a-4592-ab21-2e80793d42c9/kube-rbac-proxy-self/0.log" Apr 17 18:28:30.021095 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.021064 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8882s_83bdc5ec-af0a-4592-ab21-2e80793d42c9/openshift-state-metrics/0.log" Apr 17 18:28:30.078699 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.078663 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/prometheus/0.log" Apr 17 18:28:30.099227 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.099201 
2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/config-reloader/0.log" Apr 17 18:28:30.123959 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.123878 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/thanos-sidecar/0.log" Apr 17 18:28:30.148992 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.148959 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/kube-rbac-proxy-web/0.log" Apr 17 18:28:30.172119 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.172089 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/kube-rbac-proxy/0.log" Apr 17 18:28:30.193846 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.193805 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/kube-rbac-proxy-thanos/0.log" Apr 17 18:28:30.219935 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.219906 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90ed7337-b3e7-477c-967b-b847ef1e7833/init-config-reloader/0.log" Apr 17 18:28:30.334732 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.334702 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8b49859d5-x88sb_bd58a1a6-2c41-40e0-adfd-e35fe24330cb/telemeter-client/0.log" Apr 17 18:28:30.360526 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.360496 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8b49859d5-x88sb_bd58a1a6-2c41-40e0-adfd-e35fe24330cb/reload/0.log" Apr 17 18:28:30.381496 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:30.381420 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8b49859d5-x88sb_bd58a1a6-2c41-40e0-adfd-e35fe24330cb/kube-rbac-proxy/0.log" Apr 17 18:28:32.814322 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814288 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b"] Apr 17 18:28:32.814878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814779 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="gather" Apr 17 18:28:32.814878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814796 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="gather" Apr 17 18:28:32.814878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814827 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="copy" Apr 17 18:28:32.814878 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814835 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="copy" Apr 17 18:28:32.815032 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814900 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="copy" Apr 17 18:28:32.815032 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.814915 2566 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="e3e34830-b33d-40b3-b1b2-47916f8017fd" containerName="gather" Apr 17 18:28:32.819523 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.819499 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:32.826408 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.826273 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b"] Apr 17 18:28:32.915078 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.915046 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-sys\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:32.915281 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.915085 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-podres\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:32.915281 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.915119 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxs2w\" (UniqueName: \"kubernetes.io/projected/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-kube-api-access-bxs2w\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:32.915281 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.915149 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-proc\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:32.915281 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:32.915173 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-lib-modules\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016135 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016098 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-lib-modules\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016340 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016220 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-sys\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " 
pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016340 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-podres\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016340 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016328 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxs2w\" (UniqueName: \"kubernetes.io/projected/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-kube-api-access-bxs2w\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-proc\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016454 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-proc\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016518 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-sys\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016687 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016580 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-podres\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.016947 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.016913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-lib-modules\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.025052 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.025022 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxs2w\" (UniqueName: \"kubernetes.io/projected/9f25d1ce-61cd-4714-9d02-6734bfc7ab53-kube-api-access-bxs2w\") pod \"perf-node-gather-daemonset-8288b\" (UID: \"9f25d1ce-61cd-4714-9d02-6734bfc7ab53\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.130949 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.130868 2566 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.269056 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.269022 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b"] Apr 17 18:28:33.272313 ip-10-0-140-147 kubenswrapper[2566]: W0417 18:28:33.272282 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9f25d1ce_61cd_4714_9d02_6734bfc7ab53.slice/crio-4bcba47765f2edbb703b989d094d7a449716f08840ffb370693b9b376511ca6e WatchSource:0}: Error finding container 4bcba47765f2edbb703b989d094d7a449716f08840ffb370693b9b376511ca6e: Status 404 returned error can't find the container with id 4bcba47765f2edbb703b989d094d7a449716f08840ffb370693b9b376511ca6e Apr 17 18:28:33.655915 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.655883 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-45bfq_624a9915-ae5a-4744-8433-7bcca05df5bf/dns/0.log" Apr 17 18:28:33.677827 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.677796 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-45bfq_624a9915-ae5a-4744-8433-7bcca05df5bf/kube-rbac-proxy/0.log" Apr 17 18:28:33.681896 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.681871 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" event={"ID":"9f25d1ce-61cd-4714-9d02-6734bfc7ab53","Type":"ContainerStarted","Data":"e460e8ec211126d8b6e7ad53b4fe19ab9521c66afcb92125b634e6a9155ced33"} Apr 17 18:28:33.682000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.681903 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" event={"ID":"9f25d1ce-61cd-4714-9d02-6734bfc7ab53","Type":"ContainerStarted","Data":"4bcba47765f2edbb703b989d094d7a449716f08840ffb370693b9b376511ca6e"} Apr 17 18:28:33.682000 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.681948 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:33.699502 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.699466 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" podStartSLOduration=1.699453733 podStartE2EDuration="1.699453733s" podCreationTimestamp="2026-04-17 18:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:28:33.697511558 +0000 UTC m=+3825.367591202" watchObservedRunningTime="2026-04-17 18:28:33.699453733 +0000 UTC m=+3825.369533419" Apr 17 18:28:33.849940 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:33.849913 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-td48j_9b5aeedb-5fe5-4d2a-bc4d-db986c41c8fd/dns-node-resolver/0.log" Apr 17 18:28:34.284611 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:34.284569 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5848fc8888-kv722_3bc8d11b-297a-4446-8745-9da775945615/registry/0.log" Apr 17 18:28:34.357383 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:34.357348 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-ttt97_5a3ce8ab-c625-49a4-a457-49fae7f24c9a/node-ca/0.log" Apr 17 18:28:35.392317 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:35.392287 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6bftx_44fc3e42-b872-4c53-99af-7d32682facb5/serve-healthcheck-canary/0.log" Apr 17 18:28:35.813165 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:35.813134 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sdn9m_808d341a-97c0-4c38-aece-d3877936bcb6/kube-rbac-proxy/0.log" Apr 17 18:28:35.833860 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:35.833832 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sdn9m_808d341a-97c0-4c38-aece-d3877936bcb6/exporter/0.log" Apr 17 18:28:35.856045 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:35.856024 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sdn9m_808d341a-97c0-4c38-aece-d3877936bcb6/extractor/0.log" Apr 17 18:28:38.247242 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:38.247206 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-bgpkt_7bae9142-97fb-4c94-92a0-080475df4674/s3-init/0.log" Apr 17 18:28:38.317037 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:38.317009 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-tgzvp_3d2d0aca-d91c-46e6-b056-c050639d3ba6/s3-tls-init-custom/0.log" Apr 17 18:28:38.338881 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:38.338852 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-s2mjg_b45b0d0d-ce34-4f8e-b448-0d33f7523226/s3-tls-init-serving/0.log" Apr 17 18:28:39.696204 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:39.696175 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-8288b" Apr 17 18:28:44.058394 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.058366 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/kube-multus-additional-cni-plugins/0.log" Apr 17 18:28:44.084144 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.084105 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/egress-router-binary-copy/0.log" Apr 17 18:28:44.106313 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.106291 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/cni-plugins/0.log" Apr 17 18:28:44.133148 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.133118 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/bond-cni-plugin/0.log" Apr 17 18:28:44.155660 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.155634 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/routeoverride-cni/0.log" Apr 17 18:28:44.178584 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.178557 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/whereabouts-cni-bincopy/0.log" Apr 17 18:28:44.229932 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.229906 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qn8t8_968b8801-64dd-454d-b1af-675ad2d36924/whereabouts-cni/0.log" Apr 17 18:28:44.485994 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.485966 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxqdc_ce658e37-54cb-4560-9405-21ba87bc0d35/kube-multus/0.log" Apr 17 18:28:44.559534 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.559501 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wgmfp_b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a/network-metrics-daemon/0.log" Apr 17 18:28:44.575864 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:44.575831 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wgmfp_b1844a3f-8bb0-4d0d-92b3-6ec5fa1b443a/kube-rbac-proxy/0.log" Apr 17 18:28:45.966575 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:45.966544 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-controller/0.log" Apr 17 18:28:45.983687 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:45.983647 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/0.log" Apr 17 18:28:46.015937 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.015901 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovn-acl-logging/1.log" Apr 17 18:28:46.040050 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.040022 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/kube-rbac-proxy-node/0.log" Apr 17 18:28:46.060367 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.060339 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 18:28:46.080408 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.080388 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/northd/0.log" Apr 17 18:28:46.100995 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.100972 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/nbdb/0.log" Apr 17 18:28:46.122445 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.122422 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/sbdb/0.log" Apr 17 18:28:46.328766 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:46.328730 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz777_35fe1bc2-e1a4-4744-8f67-3cc38747e567/ovnkube-controller/0.log" Apr 17 18:28:47.491262 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:47.491215 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-vdhlz_d29a1468-8ac5-454a-a993-6a5055191ec4/network-check-target-container/0.log" Apr 17 18:28:48.412936 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:48.412913 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kc5w4_b10257f3-c35f-4a14-bf62-ab474fc1eeae/iptables-alerter/0.log" Apr 17 18:28:49.117639 ip-10-0-140-147 kubenswrapper[2566]: I0417 18:28:49.117581 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nmmln_a21d46e7-a070-48be-a4ce-d2af9bd52539/tuned/0.log"