Apr 23 17:49:53.330910 ip-10-0-131-144 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 17:49:53.330921 ip-10-0-131-144 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 17:49:53.330930 ip-10-0-131-144 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 17:49:53.331251 ip-10-0-131-144 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 17:50:03.345495 ip-10-0-131-144 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 17:50:03.345507 ip-10-0-131-144 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6237d49f20d1402aa9866385b82a27a9 --
Apr 23 17:52:21.968173 ip-10-0-131-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:52:22.386708 ip-10-0-131-144 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:22.386708 ip-10-0-131-144 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:52:22.386708 ip-10-0-131-144 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:22.386708 ip-10-0-131-144 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:52:22.386708 ip-10-0-131-144 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
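The 'resources' failures in the previous boot are start preconditions, not a kubelet crash: systemd could not read the unit's environment files for the 'start-pre' task, and could not even schedule a restart because no crio.service unit existed yet. A minimal pre-restart check, as a sketch: the environment-file path below is a hypothetical placeholder (the real EnvironmentFile= paths come from "systemctl cat kubelet" on the node), while the CRI-O socket path is the --container-runtime-endpoint value that appears in the flag dump later in this boot.

#!/usr/bin/env python3
# Sketch: verify the start preconditions the systemd errors above complain about.
import os
import subprocess

ENV_FILES = ["/etc/kubernetes/kubelet-env"]   # hypothetical placeholder, not taken from this log
CRIO_SOCKET = "/var/run/crio/crio.sock"       # from the --container-runtime-endpoint flag below

for path in ENV_FILES + [CRIO_SOCKET]:
    print(f"{path}: {'present' if os.path.exists(path) else 'MISSING'}")

# "Failed to schedule restart job: Unit crio.service not found" means the
# dependency unit itself is absent, so ask systemd for it directly.
rc = subprocess.run(["systemctl", "cat", "crio.service"],
                    capture_output=True, text=True).returncode
print(f"crio.service unit: {'found' if rc == 0 else 'NOT FOUND'}")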
Apr 23 17:52:22.388398 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.388312 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:52:22.391407 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391392 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:22.391407 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391407 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391411 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391414 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391417 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391421 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391424 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391427 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391430 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391432 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391435 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391438 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391440 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391443 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391445 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391448 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391451 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391458 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391461 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391464 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391467 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:22.391470 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391469 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391472 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391475 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391478 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391481 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391485 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391489 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391494 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391498 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391502 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391505 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391507 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391510 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391513 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391515 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391518 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391520 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391523 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391525 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:22.391992 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391528 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391530 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391533 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391536 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391541 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391543 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391546 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391549 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391551 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391554 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391557 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391559 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391562 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391565 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391568 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391571 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391574 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391577 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391580 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391582 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:22.392456 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391585 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391588 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391590 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391592 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391595 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391598 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391600 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391603 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391606 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391608 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391610 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391613 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391616 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391618 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391621 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391623 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391626 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391628 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391631 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391634 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:22.392953 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391637 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:22.393441 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391639 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:22.393441 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391641 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:22.393441 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391644 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:22.393441 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391647 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:22.393441 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.391651 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
[... the same set of "unrecognized feature gate" warnings is logged a second time (W0423 17:52:22.392040 through 17:52:22.392273), including repeats of the KMSv1 and ServiceAccountTokenNodeBinding notices ...]
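Each warning in the block above has the same shape (feature_gate.go:328] unrecognized feature gate: <Name>), and as the elided repeat shows, the kubelet re-emits the whole set every time it parses the gate list, so the stream is far noisier than its information content. A minimal sketch for collapsing it into the distinct gate names, assuming the raw text is piped in from journalctl -u kubelet:

#!/usr/bin/env python3
# Sketch: collapse repeated "unrecognized feature gate" warnings into a
# distinct, counted list. Suggested usage (an assumption, adjust to taste):
#   journalctl -u kubelet | python3 dedupe_gates.py
import re
import sys
from collections import Counter

# Matches the tail of lines like:
#   ... feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
gate_re = re.compile(r"unrecognized feature gate: (\S+)")

counts = Counter(gate_re.findall(sys.stdin.read()))
for gate, n in sorted(counts.items()):
    print(f"{gate}  (warned {n}x)")
print(f"-- {len(counts)} distinct gates, {sum(counts.values())} warnings total")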
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392336 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392343 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392349 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392353 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392357 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392361 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392365 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392370 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:52:22.395459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392373 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392376 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392380 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392383 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392386 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392389 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392391 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392395 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392397 2568 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392400 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392403 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392409 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392412 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392415 2568 flags.go:64] FLAG: --config-dir=""
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392418 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392421 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392425 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392428 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392431 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392435 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392438 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392441 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392445 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392448 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392451 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:52:22.395991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392455 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392458 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392461 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392464 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392467 2568 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392470 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392475 2568 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392478 2568 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392481 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392488 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392491 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392495 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392497 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392500 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392503 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392506 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392509 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392512 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392516 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392519 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392522 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392525 2568 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392528 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392531 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392539 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 17:52:22.396809 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392542 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392545 2568 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392548 2568 flags.go:64] FLAG: --help="false"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392551 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392554 2568 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392557 2568 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392560 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392563 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392566 2568 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392569 2568 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392572 2568 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392575 2568 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392578 2568 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392581 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392583 2568 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392587 2568 flags.go:64] FLAG: --kube-reserved=""
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392590 2568 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392593 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392596 2568 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392599 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392602 2568 flags.go:64] FLAG: --lock-file=""
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392605 2568 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392607 2568 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392610 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 17:52:22.397646 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392616 2568 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392620 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392623 2568 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392626 2568 flags.go:64] FLAG: --logging-format="text"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392628 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392632 2568 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392634 2568 flags.go:64] FLAG: --manifest-url=""
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392638 2568 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392642 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392645 2568 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392649 2568 flags.go:64] FLAG: --max-pods="110"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392652 2568 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392655 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392658 2568 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392661 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392664 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392667 2568 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392670 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392677 2568 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392680 2568 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392683 2568 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392686 2568 flags.go:64] FLAG: --pod-cidr=""
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392702 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 17:52:22.398268 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392708 2568 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392711 2568 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392714 2568 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392717 2568 flags.go:64] FLAG: --port="10250"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392720 2568 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392723 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0314ab5bcba2e417b"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392726 2568 flags.go:64] FLAG: --qos-reserved=""
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392729 2568 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392732 2568 flags.go:64] FLAG: --register-node="true"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392735 2568 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392739 2568 flags.go:64] FLAG: --register-with-taints=""
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392743 2568 flags.go:64] FLAG: --registry-burst="10"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392745 2568 flags.go:64] FLAG: --registry-qps="5"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392748 2568 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392751 2568 flags.go:64] FLAG: --reserved-memory=""
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392755 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392758 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392762 2568 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392765 2568 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392768 2568 flags.go:64] FLAG: --runonce="false"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392771 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392774 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392777 2568 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392780 2568 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392783 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392785 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 17:52:22.398861 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392788 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392791 2568 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392794 2568 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392797 2568 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392800 2568 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392803 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392806 2568 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392808 2568 flags.go:64] FLAG: --system-cgroups=""
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392811 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392816 2568 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392820 2568 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392822 2568 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392826 2568 flags.go:64] FLAG: --tls-min-version=""
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392829 2568 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392832 2568 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392834 2568 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392839 2568 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392842 2568 flags.go:64] FLAG: --v="2"
flags.go:64] FLAG: --version="false" Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392849 2568 flags.go:64] FLAG: --vmodule="" Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392853 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.392856 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392954 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392959 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:52:22.399495 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392962 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392965 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392968 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392971 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392973 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392976 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392979 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392981 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392984 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392986 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392989 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392992 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392994 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.392997 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393000 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393002 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393005 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:22.400078 ip-10-0-131-144 
kubenswrapper[2568]: W0423 17:52:22.393007 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393010 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393013 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:22.400078 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393015 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393018 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393021 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393023 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393027 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393031 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393046 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393050 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393052 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393055 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393057 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393061 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393064 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393066 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393069 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393072 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393075 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393078 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393080 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393083 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 
23 17:52:22.400613 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393086 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393088 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393091 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393094 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393096 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393099 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393101 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393104 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393107 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393109 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393112 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393117 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393121 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393123 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393126 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393128 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393135 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393138 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393141 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393143 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:52:22.401132 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393146 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393149 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 
17:52:22.393151 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393155 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393157 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393160 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393162 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393165 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393168 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393170 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393173 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393175 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393177 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393180 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393183 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393185 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393187 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393190 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393194 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
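The repeated feature_gate.go:328 warnings above (and the further passes that follow) appear to come from OpenShift handing its full cluster feature-gate list to the kubelet, which only recognizes the upstream Kubernetes gate names: unknown names are warned about and skipped rather than failing startup, while deprecated gates (KMSv1, feature_gate.go:349) and GA gates (ServiceAccountTokenNodeBinding, feature_gate.go:351) still take effect with a warning. A minimal Go sketch of that warn-and-continue parsing is below; the known table holds only a few names sampled from this log, and parse is an illustrative helper, not the kubelet's actual implementation:

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // known mirrors a handful of upstream gates visible in the log's
    // final "feature gates:" map; the kubelet's real table is much larger.
    var known = map[string]bool{
        "KMSv1":                          true,
        "NodeSwap":                       true,
        "ImageVolume":                    true,
        "ServiceAccountTokenNodeBinding": true,
    }

    // parse mimics the warn-and-continue behaviour seen in the log:
    // unrecognized names only produce a warning, they never abort startup.
    func parse(spec string) map[string]bool {
        gates := map[string]bool{}
        for _, kv := range strings.Split(spec, ",") {
            name, val, _ := strings.Cut(kv, "=")
            if !known[name] {
                fmt.Printf("W] unrecognized feature gate: %s\n", name)
                continue
            }
            b, err := strconv.ParseBool(val)
            if err != nil {
                continue // the real parser rejects a malformed spec outright
            }
            gates[name] = b
        }
        return gates
    }

    func main() {
        // MachineConfigNodes is an OpenShift-only gate taken from the log,
        // so it triggers the warning path.
        fmt.Println(parse("KMSv1=true,MachineConfigNodes=true,NodeSwap=false"))
    }

Run as-is this prints one warning for MachineConfigNodes and returns map[KMSv1:true NodeSwap:false], mirroring how the log still reaches a complete "feature gates:" map despite dozens of warnings.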
Apr 23 17:52:22.401625 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393197 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393200 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393203 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393206 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.393208 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.393214 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.399673 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.399776 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399820 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399825 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399829 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399832 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399835 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399837 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399840 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399843 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:22.402153 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399846 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399849 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399851 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399855 2568 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399857 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399860 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399862 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399865 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399867 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399870 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399872 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399875 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399878 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399880 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399883 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399885 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399888 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399891 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399893 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399896 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:22.402556 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399899 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399902 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399904 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399907 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399910 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399913 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:22.403083 ip-10-0-131-144 
kubenswrapper[2568]: W0423 17:52:22.399915 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399918 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399921 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399923 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399942 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399946 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399949 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399951 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399954 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399957 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399960 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399962 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399965 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399968 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:22.403083 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399970 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399973 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399975 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399978 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399980 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399983 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399985 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399988 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399991 2568 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399994 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399996 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.399999 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400002 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400004 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400007 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400010 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400014 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400017 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400020 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400023 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:22.403600 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400025 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400028 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400030 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400033 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400035 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400038 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400040 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400043 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400046 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400048 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400051 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 
23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400054 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400056 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400059 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400061 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400064 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400067 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:52:22.404195 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400072 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.400077 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400189 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400194 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400197 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400200 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400203 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400206 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400208 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400211 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400214 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400216 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400219 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400221 2568 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400225 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400227 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:22.404619 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400230 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400232 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400234 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400237 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400239 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400242 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400244 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400247 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400250 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400253 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400255 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400258 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400260 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400262 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400265 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400267 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400270 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400272 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400275 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400277 2568 feature_gate.go:328] 
unrecognized feature gate: MachineAPIMigration Apr 23 17:52:22.405116 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400279 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400282 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400284 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400287 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400290 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400292 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400295 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400297 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400300 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400303 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400305 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400308 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400311 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400313 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400316 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400318 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400321 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400323 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400326 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400328 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:52:22.405638 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400331 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400335 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:22.406275 
ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400337 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400340 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400342 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400345 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400347 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400350 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400352 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400355 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400358 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400360 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400363 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400366 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400368 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400371 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400373 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400376 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400378 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400381 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:22.406275 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400383 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400386 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400388 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400391 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400393 2568 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400396 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400398 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400401 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400406 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400411 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400414 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:22.400418 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.400423 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.401123 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 17:52:22.406990 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.402952 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 17:52:22.407353 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.403873 2568 server.go:1019] "Starting client certificate rotation" Apr 23 17:52:22.407353 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.403988 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:52:22.407353 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.404811 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:52:22.426017 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.426001 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:52:22.430600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.430582 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:52:22.444858 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.444834 2568 log.go:25] "Validated CRI v1 runtime API" Apr 23 17:52:22.450402 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.450384 2568 log.go:25] "Validated CRI v1 image API" Apr 23 17:52:22.454175 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.454161 2568 server.go:1452] "Using cgroup driver setting 
received from the CRI runtime" cgroupDriver="systemd" Apr 23 17:52:22.456358 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.456339 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:52:22.458115 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.458096 2568 fs.go:135] Filesystem UUIDs: map[2a358010-c1c7-4dea-af40-abb98a66bec6:/dev/nvme0n1p3 2d00d75f-1a0b-4e02-8d53-f76674d302c3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 23 17:52:22.458162 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.458116 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 17:52:22.464224 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.464124 2568 manager.go:217] Machine: {Timestamp:2026-04-23 17:52:22.462258853 +0000 UTC m=+0.382351320 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105438 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d70a716feff83b4f21dbbe82ba0db SystemUUID:ec2d70a7-16fe-ff83-b4f2-1dbbe82ba0db BootID:6237d49f-20d1-402a-a986-6385b82a27a9 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:29:25:c4:b8:4f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:29:25:c4:b8:4f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:a9:59:22:c9:32 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] 
UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 17:52:22.464224 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.464224 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 23 17:52:22.464327 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.464297 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 17:52:22.466767 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.466743 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 17:52:22.466895 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.466770 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 17:52:22.466953 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.466904 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 17:52:22.466953 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.466912 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 17:52:22.466953 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.466925 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:52:22.467663 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.467653 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:52:22.468856 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.468846 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:52:22.468984 ip-10-0-131-144 
kubenswrapper[2568]: I0423 17:52:22.468975 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 17:52:22.471353 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.471344 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 23 17:52:22.471388 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.471361 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 17:52:22.471388 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.471373 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 17:52:22.471388 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.471382 2568 kubelet.go:397] "Adding apiserver pod source" Apr 23 17:52:22.471503 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.471391 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 17:52:22.472869 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.472857 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:52:22.472919 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.472875 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:52:22.476258 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.476243 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:52:22.477536 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.477519 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:52:22.479105 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479092 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479111 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479117 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479126 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479138 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479148 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479156 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 17:52:22.479172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479166 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:52:22.479365 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479177 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:52:22.479365 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479185 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:52:22.479365 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.479205 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:52:22.479365 ip-10-0-131-144 kubenswrapper[2568]: I0423 
17:52:22.479214 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:52:22.480286 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.480267 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dpgd5" Apr 23 17:52:22.480923 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.480912 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:52:22.480923 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.480923 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:52:22.482849 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.482790 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:52:22.482972 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.482950 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:52:22.484459 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.484445 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:52:22.484533 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.484486 2568 server.go:1295] "Started kubelet" Apr 23 17:52:22.484589 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.484539 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:52:22.484681 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.484625 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:52:22.484721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.484703 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:52:22.485401 ip-10-0-131-144 systemd[1]: Started Kubernetes Kubelet. 
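By this point the kubelet has validated the CRI v1 runtime and image APIs against CRI-O 1.33.10, adopted the systemd cgroup driver reported by the runtime, and built its Container Manager config: SystemReserved of 500m CPU / 1Gi memory / 1Gi ephemeral-storage, a 4096 pod-PIDs limit, and hard eviction thresholds (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%). Assuming the standard node-allocatable formula (capacity minus kube-reserved, system-reserved, and the hard-eviction memory threshold; KubeReserved is null in this nodeConfig), the values logged above work out as in this small self-contained Go calculation:

    package main

    import "fmt"

    // Values copied from the log: MemoryCapacity from the cAdvisor
    // "Machine:" record, SystemReserved and the memory.available hard
    // eviction threshold from the Container Manager nodeConfig.
    const (
        capacity       = 33164488704 // bytes, ~30.9 GiB
        systemReserved = 1 << 30     // "memory":"1Gi"
        hardEviction   = 100 << 20   // memory.available < 100Mi
    )

    func main() {
        // allocatable = capacity - kube-reserved - system-reserved - hard eviction;
        // kube-reserved is null here, so it contributes nothing.
        alloc := capacity - systemReserved - hardEviction
        fmt.Printf("allocatable memory: %d bytes (%.2f GiB)\n",
            alloc, float64(alloc)/(1<<30))
    }

So a ~30.9 GiB machine under this config should advertise roughly 29.8 GiB of allocatable memory to the scheduler once the node registers.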
Apr 23 17:52:22.485664 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.485639 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 17:52:22.488768 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.488737 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 17:52:22.489770 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.489754 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:22.492113 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.492089 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 17:52:22.492587 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.492565 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 17:52:22.493230 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493214 2568 factory.go:55] Registering systemd factory
Apr 23 17:52:22.493306 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493240 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 23 17:52:22.493373 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493313 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 17:52:22.493373 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493357 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 17:52:22.493373 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493371 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 17:52:22.493534 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493502 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 17:52:22.493584 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.493502 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:52:22.493584 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493552 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 17:52:22.493584 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493504 2568 factory.go:153] Registering CRI-O factory
Apr 23 17:52:22.493584 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493572 2568 factory.go:223] Registration of the crio container factory successfully
Apr 23 17:52:22.493777 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493634 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 17:52:22.493777 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493668 2568 factory.go:103] Registering Raw factory
Apr 23 17:52:22.493777 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.493688 2568 manager.go:1196] Started watching for new ooms in manager
Apr 23 17:52:22.494093 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.494080 2568 manager.go:319] Starting recovery of all containers
Apr 23 17:52:22.500925 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.500899 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 17:52:22.501043 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.501020 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:22.502210 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.501173 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd667781559 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.484456793 +0000 UTC m=+0.404549261,LastTimestamp:2026-04-23 17:52:22.484456793 +0000 UTC m=+0.404549261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.505458 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.505445 2568 manager.go:324] Recovery completed
Apr 23 17:52:22.509474 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.509463 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.511741 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.511726 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.511828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.511756 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.511828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.511770 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.512300 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.512287 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 17:52:22.512300 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.512299 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 17:52:22.512393 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.512313 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:52:22.513535 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.513472 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.514818 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.514805 2568 policy_none.go:49] "None policy: Start"
Apr 23 17:52:22.514859 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.514821 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 17:52:22.514859 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.514831 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 17:52:22.523206 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.523105 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.530665 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.530590 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551276 2568 manager.go:341] "Starting Device Plugin manager"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.551318 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551332 2568 server.go:85] "Starting device plugin registration server"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551563 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551576 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551687 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551755 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.551763 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.552436 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 17:52:22.562898 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.552465 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:52:22.570128 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.570059 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66c0dd3e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.561379303 +0000 UTC m=+0.481471769,LastTimestamp:2026-04-23 17:52:22.561379303 +0000 UTC m=+0.481471769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.593203 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.593181 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 17:52:22.594327 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.594313 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 17:52:22.594409 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.594341 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 17:52:22.594409 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.594360 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
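[analysis note, not journal output] Up to this point the kubelet itself has come up cleanly, but every API request is denied because it is still authenticating as system:anonymous: until the node's bootstrap client certificate is approved it cannot register, create its lease, post events, or read csinodes. If this state persists, a reasonable first check from a machine with cluster credentials is the CSR queue; the commands below are standard oc/journalctl invocations offered as a diagnostic sketch, not part of this capture, and <csr-name> is a placeholder.

  $ oc get csr                                                      # look for Pending CSRs from this node
  $ oc adm certificate approve <csr-name>                           # approve manually if auto-approval is stuck
  $ journalctl -u kubelet --no-pager | grep -c 'system:anonymous'   # how often the kubelet was rejected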
Apr 23 17:52:22.594409 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.594370 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 17:52:22.594545 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.594443 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 17:52:22.603439 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.603417 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 23 17:52:22.652710 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.652658 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.653432 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.653416 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.653500 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.653446 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.653500 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.653456 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.653500 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.653475 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.662622 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.662557 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.65343328 +0000 UTC m=+0.573525746,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.672101 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.672076 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.672376 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.672315 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.6534512 +0000 UTC m=+0.573543666,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.674182 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.674129 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918ee95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.653459707 +0000 UTC m=+0.573552173,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.695066 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.695046 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"]
Apr 23 17:52:22.695123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.695101 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.695806 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.695792 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.695880 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.695823 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.695880 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.695837 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.697111 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697098 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.697244 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697231 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.697294 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697255 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.697758 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697742 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.697824 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697770 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.697824 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697784 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.697824 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697792 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.697824 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697809 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.697824 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.697821 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.698107 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.698052 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.695806171 +0000 UTC m=+0.615898644,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.698960 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.698946 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.699038 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.698973 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.699614 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.699597 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.699704 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.699623 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.699704 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.699634 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.705163 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.705087 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.695829778 +0000 UTC m=+0.615922264,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.705163 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.705155 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms"
Apr 23 17:52:22.710539 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.710485 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918ee95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.695842724 +0000 UTC m=+0.615935191,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.719726 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.719668 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.6977551 +0000 UTC m=+0.617847572,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.726855 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.726793 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.697777063 +0000 UTC m=+0.617869530,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.733647 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.733620 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.735455 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.735401 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918ee95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.697787882 +0000 UTC m=+0.617880347,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.738018 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.737996 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.742997 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.742945 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.697801997 +0000 UTC m=+0.617894463,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.754616 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.754560 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.697814077 +0000 UTC m=+0.617906544,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.762479 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.762421 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918ee95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.697827473 +0000 UTC m=+0.617919941,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.771047 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.770993 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.699611067 +0000 UTC m=+0.619703533,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.778116 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.778062 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.699629524 +0000 UTC m=+0.619721989,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.786634 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.786573 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918ee95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.699638124 +0000 UTC m=+0.619730591,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.794414 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.794393 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dc4456d0c7e39a0756b16fde0e83318-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal\" (UID: \"8dc4456d0c7e39a0756b16fde0e83318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.794488 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.794418 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68669faf3d2c20c0dc154d751d5f0070-config\") pod \"kube-apiserver-proxy-ip-10-0-131-144.ec2.internal\" (UID: \"68669faf3d2c20c0dc154d751d5f0070\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.794488 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.794434 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8dc4456d0c7e39a0756b16fde0e83318-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal\" (UID: \"8dc4456d0c7e39a0756b16fde0e83318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.872991 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.872960 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:22.873914 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.873892 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:22.874009 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.873921 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:22.874009 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.873947 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:22.874009 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.873971 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.881581 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.881512 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:22.873908129 +0000 UTC m=+0.794000598,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.889980 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.889953 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.890079 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.890010 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:22.873925719 +0000 UTC m=+0.794018184,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:22.894979 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.894957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68669faf3d2c20c0dc154d751d5f0070-config\") pod \"kube-apiserver-proxy-ip-10-0-131-144.ec2.internal\" (UID: \"68669faf3d2c20c0dc154d751d5f0070\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.895050 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.894988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8dc4456d0c7e39a0756b16fde0e83318-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal\" (UID: \"8dc4456d0c7e39a0756b16fde0e83318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.895050 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.895004 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dc4456d0c7e39a0756b16fde0e83318-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal\" (UID: \"8dc4456d0c7e39a0756b16fde0e83318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.895118 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.895050 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68669faf3d2c20c0dc154d751d5f0070-config\") pod \"kube-apiserver-proxy-ip-10-0-131-144.ec2.internal\" (UID: \"68669faf3d2c20c0dc154d751d5f0070\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.895118 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.895062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8dc4456d0c7e39a0756b16fde0e83318-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal\" (UID: \"8dc4456d0c7e39a0756b16fde0e83318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.895118 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:22.895099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dc4456d0c7e39a0756b16fde0e83318-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal\" (UID: \"8dc4456d0c7e39a0756b16fde0e83318\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:22.897696 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:22.897632 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918ee95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918ee95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511775381 +0000 UTC m=+0.431867850,LastTimestamp:2026-04-23 17:52:22.873951433 +0000 UTC m=+0.794043899,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:23.036467 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.036406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:23.039137 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.039114 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:52:23.115780 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.115743 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms"
Apr 23 17:52:23.290374 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.290309 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:23.291183 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.291165 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:23.291298 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.291198 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:23.291298 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.291211 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:23.291298 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.291238 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:23.298697 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.298614 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd6691869c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd6691869c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511741378 +0000 UTC m=+0.431833864,LastTimestamp:2026-04-23 17:52:23.291180883 +0000 UTC m=+1.211273349,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:23.307160 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.307085 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-131-144.ec2.internal.18a90dd66918bc0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-144.ec2.internal.18a90dd66918bc0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-144.ec2.internal,UID:ip-10-0-131-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:22.511762446 +0000 UTC m=+0.431854913,LastTimestamp:2026-04-23 17:52:23.291205501 +0000 UTC m=+1.211297966,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:23.307244 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.307224 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:23.350918 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.350895 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:52:23.500831 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.500807 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:23.516048 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.516020 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:23.567830 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:23.567799 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68669faf3d2c20c0dc154d751d5f0070.slice/crio-1b9751ed09666d7a6f2d7c84647da3da0472d463289728344b6586dd4077bd67 WatchSource:0}: Error finding container 1b9751ed09666d7a6f2d7c84647da3da0472d463289728344b6586dd4077bd67: Status 404 returned error can't find the container with id 1b9751ed09666d7a6f2d7c84647da3da0472d463289728344b6586dd4077bd67
Apr 23 17:52:23.568259 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:52:23.568235 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dc4456d0c7e39a0756b16fde0e83318.slice/crio-bbc22f9a4761529c08a423340d63a7de7e911249e23700c938493b78cc1c48f4 WatchSource:0}: Error finding container bbc22f9a4761529c08a423340d63a7de7e911249e23700c938493b78cc1c48f4: Status 404 returned error can't find the container with id bbc22f9a4761529c08a423340d63a7de7e911249e23700c938493b78cc1c48f4
Apr 23 17:52:23.572593 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.572577 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:52:23.582597 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.582524 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd6a85718b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\",Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:23.572814007 +0000 UTC m=+1.492906459,LastTimestamp:2026-04-23 17:52:23.572814007 +0000 UTC m=+1.492906459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:23.589205 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.589134 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-131-144.ec2.internal.18a90dd6a857ef3a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-131-144.ec2.internal,UID:68669faf3d2c20c0dc154d751d5f0070,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\",Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:23.572868922 +0000 UTC m=+1.492961374,LastTimestamp:2026-04-23 17:52:23.572868922 +0000 UTC m=+1.492961374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:23.596890 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.596851 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerStarted","Data":"bbc22f9a4761529c08a423340d63a7de7e911249e23700c938493b78cc1c48f4"}
Apr 23 17:52:23.597771 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:23.597747 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal" event={"ID":"68669faf3d2c20c0dc154d751d5f0070","Type":"ContainerStarted","Data":"1b9751ed09666d7a6f2d7c84647da3da0472d463289728344b6586dd4077bd67"}
Apr 23 17:52:23.925328 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:23.925241 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s"
Apr 23 17:52:24.033034 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:24.032996 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 23 17:52:24.049534 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:24.049500 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:52:24.108072 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:24.108044 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:24.109992 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:24.109217 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:24.109992 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:24.109250 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:24.109992 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:24.109264 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:24.109992 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:24.109296 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:24.128021 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:24.127996 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:24.498843 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:24.498816 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:25.116995 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.116677 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-131-144.ec2.internal.18a90dd703adfd98 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-131-144.ec2.internal,UID:68669faf3d2c20c0dc154d751d5f0070,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\" in 1.532s (1.532s including waiting). Image size: 488332864 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:25.105235352 +0000 UTC m=+3.025327824,LastTimestamp:2026-04-23 17:52:25.105235352 +0000 UTC m=+3.025327824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:25.125776 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.125703 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd703b6005a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" in 1.532s (1.532s including waiting). Image size: 468435751 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:25.105760346 +0000 UTC m=+3.025852818,LastTimestamp:2026-04-23 17:52:25.105760346 +0000 UTC m=+3.025852818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:25.183877 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.182992 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-131-144.ec2.internal.18a90dd707b5c9fd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-131-144.ec2.internal,UID:68669faf3d2c20c0dc154d751d5f0070,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Created,Message:Created container: haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:25.172855293 +0000 UTC m=+3.092947759,LastTimestamp:2026-04-23 17:52:25.172855293 +0000 UTC m=+3.092947759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:25.191454 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.191312 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-131-144.ec2.internal.18a90dd7080dfd38 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-131-144.ec2.internal,UID:68669faf3d2c20c0dc154d751d5f0070,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Started,Message:Started container haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:25.178635576 +0000 UTC m=+3.098728045,LastTimestamp:2026-04-23 17:52:25.178635576 +0000 UTC m=+3.098728045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:25.501797 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.501687 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:25.536637 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.536608 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s"
Apr 23 17:52:25.601677 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.601638 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal" event={"ID":"68669faf3d2c20c0dc154d751d5f0070","Type":"ContainerStarted","Data":"18651141f7c27a5013f705fd4c444cf7869bd28719eb2231944a08edd74be8a6"}
Apr 23 17:52:25.601803 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.601711 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:25.602685 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.602669 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:25.602727 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.602701 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:25.602727 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.602714 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:25.602901 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.602887 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:25.637279 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.637201 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd722e5b870 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:25.62898136 +0000 UTC m=+3.549073826,LastTimestamp:2026-04-23 17:52:25.62898136 +0000 UTC m=+3.549073826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:25.646312 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.646245 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7234d8494 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:25.635783828 +0000 UTC m=+3.555876293,LastTimestamp:2026-04-23 17:52:25.635783828 +0000 UTC m=+3.555876293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:25.717789 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.717751 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 23 17:52:25.729060 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.729030 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:25.731445 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.731424 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:25.731515 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.731458 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:25.731515 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.731469 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:25.731515 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:25.731495 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:25.749426 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:25.749400 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:26.499985 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.499948 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:26.532910 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.532876 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:52:26.551369 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.551343 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:26.604093 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.604057 2568 generic.go:358] "Generic (PLEG): container finished" podID="8dc4456d0c7e39a0756b16fde0e83318" containerID="968791006f4209035640f8460a41e822491326de81a61b27fd3389122029e8e7" exitCode=0
Apr 23 17:52:26.604209 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.604133 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:26.604209 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.604144 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerDied","Data":"968791006f4209035640f8460a41e822491326de81a61b27fd3389122029e8e7"}
Apr 23 17:52:26.604209 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.604170 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:26.606448 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.606433 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:26.606527 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.606462 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:26.606527 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.606474 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:26.606527 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.606433 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:26.606640 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.606535 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:26.606640 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:26.606547 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:26.606640 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.606623 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:26.606772 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.606729 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:52:26.616837 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.616742 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.608493963 +0000 UTC m=+4.528586435,LastTimestamp:2026-04-23 17:52:26.608493963 +0000 UTC m=+4.528586435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:52:26.721908 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.721833 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.709339375 +0000 UTC m=+4.629431849,LastTimestamp:2026-04-23 17:52:26.709339375 +0000 UTC m=+4.629431849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:52:26.731905 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.731814 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.716740311 +0000 UTC m=+4.636832785,LastTimestamp:2026-04-23 
17:52:26.716740311 +0000 UTC m=+4.636832785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:52:26.873153 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:26.873074 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:52:27.499612 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.499573 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:52:27.606831 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.606804 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/0.log" Apr 23 17:52:27.607196 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.607120 2568 generic.go:358] "Generic (PLEG): container finished" podID="8dc4456d0c7e39a0756b16fde0e83318" containerID="924bead263d35e0c54e0ba4d21c42fb05958b939637fb7c53e472a3c0309b60e" exitCode=1 Apr 23 17:52:27.607196 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.607153 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerDied","Data":"924bead263d35e0c54e0ba4d21c42fb05958b939637fb7c53e472a3c0309b60e"} Apr 23 17:52:27.607262 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.607199 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:52:27.607958 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.607945 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:52:27.608005 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.607970 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:52:27.608005 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.607979 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:52:27.608154 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:27.608142 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:52:27.608196 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:27.608187 2568 scope.go:117] "RemoveContainer" containerID="924bead263d35e0c54e0ba4d21c42fb05958b939637fb7c53e472a3c0309b60e" Apr 23 17:52:27.619801 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:27.619717 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.608493963 +0000 UTC m=+4.528586435,LastTimestamp:2026-04-23 17:52:27.609986097 +0000 UTC m=+5.530078569,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:52:27.711177 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:27.711060 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.709339375 +0000 UTC m=+4.629431849,LastTimestamp:2026-04-23 17:52:27.69974602 +0000 UTC m=+5.619838490,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:52:27.722662 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:27.722546 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.716740311 +0000 UTC m=+4.636832785,LastTimestamp:2026-04-23 17:52:27.712704327 +0000 UTC m=+5.632796799,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" 
Apr 23 17:52:28.498836 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.498812 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:28.611223 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.611193 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/1.log"
Apr 23 17:52:28.611609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.611593 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/0.log"
Apr 23 17:52:28.612003 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.611979 2568 generic.go:358] "Generic (PLEG): container finished" podID="8dc4456d0c7e39a0756b16fde0e83318" containerID="9840694f66a802b9b244f21d3d05bc7d159ee863404907319adb98e7c8403ed6" exitCode=1
Apr 23 17:52:28.612100 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.612018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerDied","Data":"9840694f66a802b9b244f21d3d05bc7d159ee863404907319adb98e7c8403ed6"}
Apr 23 17:52:28.612100 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.612051 2568 scope.go:117] "RemoveContainer" containerID="924bead263d35e0c54e0ba4d21c42fb05958b939637fb7c53e472a3c0309b60e"
Apr 23 17:52:28.612100 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.612062 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:28.612860 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.612842 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:28.612963 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.612874 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:28.612963 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.612885 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:28.614837 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:28.614819 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:28.614951 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.614867 2568 scope.go:117] "RemoveContainer" containerID="9840694f66a802b9b244f21d3d05bc7d159ee863404907319adb98e7c8403ed6"
Apr 23 17:52:28.615022 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:28.615005 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:52:28.626614 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:28.626532 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:28.744665 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:28.744633 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="6.4s"
Apr 23 17:52:28.949773 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.949744 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:28.950743 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.950723 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:28.950860 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.950760 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:28.950860 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.950775 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:28.950860 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:28.950807 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:28.969436 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:28.969407 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:29.499432 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.499401 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:29.614128 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.614103 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/1.log"
Apr 23 17:52:29.614520 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.614507 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:29.615382 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.615367 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:29.615469 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.615393 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:29.615469 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.615404 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:29.615609 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:29.615597 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:29.615648 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:29.615642 2568 scope.go:117] "RemoveContainer" containerID="9840694f66a802b9b244f21d3d05bc7d159ee863404907319adb98e7c8403ed6"
Apr 23 17:52:29.615770 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:29.615756 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:52:29.624131 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:29.624056 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:52:29.615728852 +0000 UTC m=+7.535821318,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:30.499018 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:30.498982 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:30.541977 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:30.541949 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 23 17:52:30.933456 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:30.933417 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:52:31.030308 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:31.030269 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:31.500078 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:31.500041 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:32.498133 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:32.498098 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:32.552583 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:32.552553 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:52:32.573638 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:32.573604 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:52:33.499123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:33.499095 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:34.501122 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:34.501091 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:35.157022 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:35.156990 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:52:35.369926 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:35.369885 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.370926 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:35.370909 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.371030 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:35.370961 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.371030 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:35.370976 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.371030 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:35.371008 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:35.389991 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:35.389967 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:35.499439 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:35.499363 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:36.498610 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:36.498577 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:37.499903 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:37.499876 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:38.498004 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:38.497979 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:39.500197 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:39.500162 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:39.770143 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:39.770064 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:40.292447 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:40.292416 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 23 17:52:40.499762 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:40.499737 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:40.642122 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:40.642089 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:52:41.497799 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:41.497771 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:42.167097 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:42.167059 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:52:42.390441 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:42.390402 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:42.391386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:42.391370 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:42.391443 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:42.391402 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:42.391443 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:42.391413 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:42.391509 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:42.391488 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:42.406845 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:42.406822 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:42.499550 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:42.499477 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:42.552859 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:42.552810 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:52:43.498999 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:43.498965 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:43.699570 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:43.699529 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:52:44.499972 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:44.499924 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:44.594598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:44.594570 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:44.595492 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:44.595475 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:44.595598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:44.595509 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:44.595598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:44.595524 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:44.595803 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:44.595788 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:44.595866 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:44.595853 2568 scope.go:117] "RemoveContainer" containerID="9840694f66a802b9b244f21d3d05bc7d159ee863404907319adb98e7c8403ed6"
Apr 23 17:52:44.605647 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:44.605559 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.608493963 +0000 UTC m=+4.528586435,LastTimestamp:2026-04-23 17:52:44.597983222 +0000 UTC m=+22.518075695,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:44.717411 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:44.717324 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.709339375 +0000 UTC m=+4.629431849,LastTimestamp:2026-04-23 17:52:44.707035728 +0000 UTC m=+22.627128203,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:44.725213 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:44.725131 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.716740311 +0000 UTC m=+4.636832785,LastTimestamp:2026-04-23 17:52:44.715668002 +0000 UTC m=+22.635760456,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:45.500372 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.500341 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:45.638757 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.638728 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/2.log"
Apr 23 17:52:45.639123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.639109 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/1.log"
Apr 23 17:52:45.639435 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.639415 2568 generic.go:358] "Generic (PLEG): container finished" podID="8dc4456d0c7e39a0756b16fde0e83318" containerID="59c094ef6b0ccc3afca9b7f9964876559a6564c77579d2953d5f3e8df6cc0f23" exitCode=1
Apr 23 17:52:45.639495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.639448 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerDied","Data":"59c094ef6b0ccc3afca9b7f9964876559a6564c77579d2953d5f3e8df6cc0f23"}
Apr 23 17:52:45.639495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.639477 2568 scope.go:117] "RemoveContainer" containerID="9840694f66a802b9b244f21d3d05bc7d159ee863404907319adb98e7c8403ed6"
Apr 23 17:52:45.639611 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.639594 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:45.640650 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.640432 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:45.640650 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.640461 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:45.640650 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.640472 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:45.640803 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:45.640697 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:45.640803 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:45.640741 2568 scope.go:117] "RemoveContainer" containerID="59c094ef6b0ccc3afca9b7f9964876559a6564c77579d2953d5f3e8df6cc0f23"
Apr 23 17:52:45.640889 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:45.640874 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:52:45.648044 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:45.647960 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:52:45.640847899 +0000 UTC m=+23.560940373,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:46.499691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:46.499657 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:46.641865 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:46.641837 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/2.log"
Apr 23 17:52:47.497851 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:47.497816 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:48.498557 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:48.498521 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:49.176231 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:49.176194 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:52:49.407416 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:49.407365 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:49.408434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:49.408415 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:49.408506 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:49.408448 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:49.408506 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:49.408460 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:49.408506 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:49.408486 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:49.424337 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:49.424312 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:49.501323 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:49.501257 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:50.498015 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:50.497979 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:51.499583 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:51.499551 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:52.497534 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:52.497506 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:52.553730 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:52.553692 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:52:53.499163 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:53.499134 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:53.982667 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:53.982640 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 23 17:52:54.500065 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:54.500035 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:55.498555 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:55.498520 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:55.610230 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:55.610193 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:56.185369 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:56.185331 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:52:56.425107 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:56.425074 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:56.426581 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:56.426562 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:56.426684 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:56.426593 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:56.426684 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:56.426603 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:56.426684 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:56.426634 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:56.442799 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:56.442725 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:56.500687 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:56.500660 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:56.663691 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:56.663658 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:52:57.502395 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:57.502359 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:58.498985 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:58.498948 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:58.594727 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:58.594683 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:58.595793 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:58.595775 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:58.595859 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:58.595810 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:58.595859 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:58.595823 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:58.596102 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:58.596086 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:52:58.596165 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:58.596154 2568 scope.go:117] "RemoveContainer" containerID="59c094ef6b0ccc3afca9b7f9964876559a6564c77579d2953d5f3e8df6cc0f23"
Apr 23 17:52:58.596308 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:58.596293 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:52:58.605767 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:58.605680 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:52:58.596260204 +0000 UTC m=+36.516352669,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:52:59.256288 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:52:59.256252 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:52:59.501446 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:52:59.501416 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:00.497696 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:00.497666 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:01.499231 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:01.499201 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:02.497189 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:02.497158 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:02.554609 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:02.554575 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:53:03.195586 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:03.195555 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:53:03.443849 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:03.443805 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:53:03.444957 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:03.444926 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:53:03.445083 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:03.444972 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23
17:53:03.445083 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:03.444981 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:03.445083 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:03.445007 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:03.461822 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:03.461743 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:03.497708 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:03.497676 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:04.499805 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:04.499776 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:05.497863 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:05.497831 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:06.499419 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:06.499389 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:07.498544 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:07.498508 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:08.499306 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:08.499268 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:09.497887 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:09.497855 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:10.205473 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:10.205442 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s" Apr 23 17:53:10.462394 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:10.462296 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:10.463420 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:10.463402 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:10.463499 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:10.463437 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:10.463499 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:10.463447 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:10.463499 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:10.463476 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:10.480123 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:10.480092 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:10.496982 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:10.496956 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:11.501397 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:11.501365 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:11.595210 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:11.595184 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:11.596252 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:11.596230 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:11.596361 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:11.596265 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:11.596361 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:11.596275 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:11.596524 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:11.596511 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:11.596576 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:11.596566 2568 scope.go:117] "RemoveContainer" containerID="59c094ef6b0ccc3afca9b7f9964876559a6564c77579d2953d5f3e8df6cc0f23" Apr 23 17:53:11.605013 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:11.604913 2568 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.608493963 +0000 UTC m=+4.528586435,LastTimestamp:2026-04-23 17:53:11.597347977 +0000 UTC m=+49.517440452,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:53:11.700511 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:11.700417 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.709339375 +0000 UTC m=+4.629431849,LastTimestamp:2026-04-23 17:53:11.687806384 +0000 UTC m=+49.607898858,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:53:11.707704 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:11.707598 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.716740311 +0000 UTC m=+4.636832785,LastTimestamp:2026-04-23 17:53:11.695859347 
+0000 UTC m=+49.615951823,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:53:12.500062 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.500030 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:12.555497 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:12.555453 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:53:12.680536 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.680504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/3.log" Apr 23 17:53:12.680842 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.680825 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/2.log" Apr 23 17:53:12.681139 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.681121 2568 generic.go:358] "Generic (PLEG): container finished" podID="8dc4456d0c7e39a0756b16fde0e83318" containerID="d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5" exitCode=1 Apr 23 17:53:12.681199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.681155 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerDied","Data":"d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5"} Apr 23 17:53:12.681199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.681184 2568 scope.go:117] "RemoveContainer" containerID="59c094ef6b0ccc3afca9b7f9964876559a6564c77579d2953d5f3e8df6cc0f23" Apr 23 17:53:12.681276 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.681268 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:12.682324 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.682056 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:12.682324 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.682094 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:12.682324 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.682111 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:12.682487 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:12.682375 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:12.682487 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:12.682433 2568 scope.go:117] "RemoveContainer" containerID="d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5" Apr 23 17:53:12.682615 ip-10-0-131-144 
kubenswrapper[2568]: E0423 17:53:12.682598 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318" Apr 23 17:53:12.689860 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:12.689787 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:53:12.682563122 +0000 UTC m=+50.602655597,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:53:13.500805 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:13.500781 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:13.683724 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:13.683697 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/3.log" Apr 23 17:53:14.497641 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:14.497598 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:15.500420 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:15.500391 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:16.497351 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:16.497323 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Apr 23 17:53:17.215944 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:17.215900 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 23 17:53:17.480632 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:17.480556 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:17.481705 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:17.481689 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:17.481783 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:17.481719 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:17.481783 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:17.481728 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:17.481783 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:17.481753 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:17.496904 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:17.496875 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:17.496904 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:17.496890 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:18.498757 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:18.498720 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:19.497730 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:19.497696 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:20.499721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:20.499689 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:21.498092 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:21.498064 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 
17:53:22.500319 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:22.500294 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:22.555596 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:22.555565 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:53:23.501021 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:23.500987 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:24.224443 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:24.224412 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 23 17:53:24.498375 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:24.498316 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:24.500079 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:24.500062 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:24.500191 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:24.500092 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:24.500191 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:24.500106 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:24.500191 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:24.500133 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:24.500454 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:24.500438 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:24.533301 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:24.532011 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:25.498722 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:25.498688 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:26.499167 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:26.499134 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:26.995361 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:26.995329 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 17:53:27.500147 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:27.500122 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:27.594752 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:27.594717 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:27.595696 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:27.595674 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:27.595798 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:27.595712 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:27.595798 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:27.595728 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:27.596034 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:27.596018 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:27.596095 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:27.596083 2568 scope.go:117] "RemoveContainer" containerID="d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5" Apr 23 17:53:27.596244 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:27.596219 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318" Apr 23 17:53:27.603696 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:27.603621 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:53:27.596184775 +0000 UTC m=+65.516277244,Count:6,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:53:28.252109 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:28.252079 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:53:28.497937 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:28.497909 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:29.175270 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:29.175233 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 23 17:53:29.497534 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:29.497456 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:30.498922 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:30.498887 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:31.231604 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:31.231571 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 23 17:53:31.499759 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:31.499689 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:31.533088 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:31.533057 2568 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 23 17:53:31.534093 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:31.534074 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:31.534198 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:31.534105 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:31.534198 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:31.534115 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:31.534198 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:31.534142 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:31.550179 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:31.550143 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:32.498285 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:32.498204 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:32.555785 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:32.555731 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:53:33.501114 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:33.501080 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:34.499943 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:34.499912 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:34.594735 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:34.594704 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:34.596371 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:34.596351 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:34.596486 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:34.596387 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:34.596486 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:34.596397 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:34.596595 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:34.596582 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the 
cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:35.500838 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:35.500809 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:36.497336 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:36.497303 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:37.500794 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:37.500765 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:38.241457 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:38.241423 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 23 17:53:38.500500 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:38.500435 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:38.550801 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:38.550776 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:38.551753 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:38.551737 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:38.551801 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:38.551767 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:38.551801 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:38.551776 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:38.551867 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:38.551807 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:38.568582 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:38.568556 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:39.499671 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:39.499643 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Apr 23 17:53:40.499914 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:40.499885 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:40.594664 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:40.594624 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:40.595559 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:40.595544 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:40.595639 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:40.595572 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:40.595639 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:40.595581 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:40.595795 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:40.595783 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:40.595836 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:40.595830 2568 scope.go:117] "RemoveContainer" containerID="d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5" Apr 23 17:53:40.595987 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:40.595973 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318" Apr 23 17:53:40.602759 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:40.602680 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:53:40.595924069 +0000 UTC m=+78.516016538,Count:7,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}" Apr 23 17:53:41.499504 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:41.499473 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:42.501749 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:42.501715 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:42.556187 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:42.556141 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:53:43.498881 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:43.498851 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:44.496972 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:44.496941 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:45.253400 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:45.253369 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 23 17:53:45.500119 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:45.500087 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:45.568910 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:45.568836 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:45.569815 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:45.569792 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:45.569909 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:45.569833 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:45.569909 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:45.569848 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:45.569909 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:45.569873 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal" Apr 23 17:53:45.586826 ip-10-0-131-144 kubenswrapper[2568]: E0423 
17:53:45.586802 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:46.501051 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:46.501022 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:46.567004 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:46.566975 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:53:47.501650 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:47.501614 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:48.500330 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:48.500304 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:49.499531 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:49.499504 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:50.503044 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:50.503019 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:51.499610 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:51.499579 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:52.261071 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:52.261039 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:53:52.503810 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:52.503787 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:52.557334 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:52.557280 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:53:52.587586 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:52.587568 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:53:52.588354 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:52.588339 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:53:52.588428 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:52.588368 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:53:52.588428 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:52.588382 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:53:52.588428 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:52.588413 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:52.610625 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:52.610605 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:53.498131 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.498103 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:53.595012 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.594991 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:53:53.595872 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.595855 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:53:53.595983 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.595885 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:53:53.595983 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.595899 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:53:53.596204 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.596188 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:53.596253 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.596243 2568 scope.go:117] "RemoveContainer" containerID="d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5"
Apr 23 17:53:53.607324 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.607207 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd75d47e58b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.608493963 +0000 UTC m=+4.528586435,LastTimestamp:2026-04-23 17:53:53.596924599 +0000 UTC m=+91.517017074,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:53:53.693541 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.693467 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7634aacef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.709339375 +0000 UTC m=+4.629431849,LastTimestamp:2026-04-23 17:53:53.684355729 +0000 UTC m=+91.604448196,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:53:53.703447 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.703378 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd763bb9ad7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:26.716740311 +0000 UTC m=+4.636832785,LastTimestamp:2026-04-23 17:53:53.691182519 +0000 UTC m=+91.611274997,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:53:53.740643 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.740624 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 17:53:53.740978 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.740963 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/3.log"
Apr 23 17:53:53.741294 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.741275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerStarted","Data":"7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"}
Apr 23 17:53:53.741386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.741375 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:53:53.742098 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.742084 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:53:53.742153 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.742112 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:53:53.742153 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.742123 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:53:53.742317 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.742306 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:53.742357 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:53.742349 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:53:53.742478 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.742463 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:53:53.749832 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:53.749741 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:53:53.742435462 +0000 UTC m=+91.662527937,Count:8,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:53:54.498376 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.498342 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:54.744332 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.744305 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 17:53:54.744649 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.744634 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/3.log"
Apr 23 17:53:54.744986 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.744967 2568 generic.go:358] "Generic (PLEG): container finished" podID="8dc4456d0c7e39a0756b16fde0e83318" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8" exitCode=1
Apr 23 17:53:54.745028 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.744999 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerDied","Data":"7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"}
Apr 23 17:53:54.745028 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.745026 2568 scope.go:117] "RemoveContainer" containerID="d07cf52690e55070d701e54399cf4ca058fdbe8ff880b724b91517cc177ca8f5"
Apr 23 17:53:54.745145 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.745131 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:53:54.746182 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.745952 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:53:54.746182 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.745983 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:53:54.746182 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.745995 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:53:54.746286 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:54.746212 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:54.746286 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:54.746257 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:53:54.746396 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:54.746378 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:53:54.754576 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:54.754503 2568 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal.18a90dd7d4e06864 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal,UID:8dc4456d0c7e39a0756b16fde0e83318,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318),Source:EventSource{Component:kubelet,Host:ip-10-0-131-144.ec2.internal,},FirstTimestamp:2026-04-23 17:52:28.614977636 +0000 UTC m=+6.535070088,LastTimestamp:2026-04-23 17:53:54.746350393 +0000 UTC m=+92.666442859,Count:9,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-144.ec2.internal,}"
Apr 23 17:53:55.501028 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:55.500999 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:55.747326 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:55.747298 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 17:53:56.498691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:56.498666 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:57.501789 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:57.501760 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:58.498977 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:58.498943 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:59.040319 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:59.040289 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:53:59.270566 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:59.270533 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 23 17:53:59.502390 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:59.502365 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:53:59.610717 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:59.610672 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:53:59.611591 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:59.611576 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:53:59.611670 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:59.611606 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:53:59.611670 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:59.611641 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:53:59.611670 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:53:59.611665 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:53:59.630261 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:53:59.630237 2568 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-131-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:00.498978 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:00.498951 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:54:01.500199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:01.500165 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:54:02.500020 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:02.499991 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:54:02.557436 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:02.557401 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:03.316887 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:03.316858 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:54:03.499242 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:03.499215 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:54:04.512823 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:04.512787 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:54:05.269064 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:05.269026 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dpgd5"
Apr 23 17:54:05.403649 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:05.403614 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:54:05.525403 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:05.525331 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:05.562280 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:05.562247 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:05.648868 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:05.648842 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:05.993989 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:05.993952 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:05.993989 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:05.993982 2568 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:06.044570 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.044541 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:06.073650 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.073619 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:54:06.088163 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.088141 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:06.157903 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.157874 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:06.270505 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.270426 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:49:05 +0000 UTC" deadline="2027-10-21 01:06:01.47293594 +0000 UTC"
Apr 23 17:54:06.270505 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.270460 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13087h11m55.2024786s"
Apr 23 17:54:06.300804 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.300780 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:06.433443 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.433411 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:06.433443 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.433438 2568 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-131-144.ec2.internal" not found
Apr 23 17:54:06.631003 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.630976 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:54:06.632310 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.632294 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:54:06.632378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.632324 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:54:06.632378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.632335 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:54:06.632378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.632360 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:06.639174 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:06.639160 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:06.639219 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.639183 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-144.ec2.internal\": node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:06.657582 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.657561 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:06.757871 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.757837 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:06.858229 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.858195 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:06.958838 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:06.958781 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.059552 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.059510 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.159999 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.159971 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.260575 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.260510 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.361011 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.360985 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.461743 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.461699 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.548683 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.548624 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:54:07.559611 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.559588 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:54:07.562092 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.562071 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.594985 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.594956 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:54:07.595863 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.595849 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:54:07.595918 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.595877 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:54:07.595918 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.595891 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:54:07.596193 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.596177 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:07.596240 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.596231 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:54:07.596369 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.596356 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:54:07.648025 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.648003 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2dtdk"
Apr 23 17:54:07.662285 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.662268 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.662344 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:07.662316 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2dtdk"
Apr 23 17:54:07.762329 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.762304 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.863240 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.863222 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:07.963788 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:07.963757 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.064434 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.064402 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.164870 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.164822 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.265845 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.265822 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.366335 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.366310 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.466830 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.466767 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.567816 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.567787 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.663076 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:08.663041 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:49:07 +0000 UTC" deadline="2027-10-28 05:15:35.257266289 +0000 UTC"
Apr 23 17:54:08.663076 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:08.663073 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13259h21m26.59419617s"
Apr 23 17:54:08.668186 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.668168 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.768582 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.768525 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.868997 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.868969 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:08.969622 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:08.969596 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.070662 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.070610 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.171059 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.171027 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.272004 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.271979 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.372532 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.372500 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.473128 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.473102 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.573966 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.573915 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.664065 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:09.663996 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:49:07 +0000 UTC" deadline="2028-01-01 04:29:00.022856468 +0000 UTC"
Apr 23 17:54:09.664065 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:09.664026 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14818h34m50.358834774s"
Apr 23 17:54:09.674156 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.674124 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.774921 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.774891 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.875385 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.875364 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:09.976047 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:09.975991 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.076941 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.076912 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.177459 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.177428 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.277905 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.277851 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.378399 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.378369 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.479133 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.479103 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.579914 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.579890 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.680975 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.680926 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.781835 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.781812 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.882890 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.882835 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:10.983088 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:10.983045 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.083626 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.083591 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.184111 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.184044 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.284605 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.284575 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.385156 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.385127 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.485487 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.485428 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.586206 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.586174 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.687016 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.686975 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.787441 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.787369 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.887859 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.887820 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:11.988670 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:11.988634 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.089125 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.089100 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.189670 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.189639 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.290147 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.290120 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.391167 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.391103 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.492082 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.492044 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.557872 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.557846 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.592370 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.592326 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.693135 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.693073 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.793430 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.793387 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.893746 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.893715 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:12.994479 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:12.994425 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.095399 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.095373 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.195842 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.195809 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.296449 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.296380 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.396903 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.396869 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.497554 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.497533 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.598448 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.598421 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.699314 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.699279 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.799639 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.799605 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:13.900163 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:13.900097 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.000302 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.000275 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.100740 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.100709 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.201614 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.201549 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.302096 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.302066 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.402600 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.402571 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.503052 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.502978 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.603718 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.603679 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.704537 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.704498 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.805204 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.805125 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:14.905678 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:14.905639 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.005983 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.005946 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.106523 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.106489 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.206991 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.206959 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.307692 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.307663 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.408280 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.408213 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.509175 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.509138 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.609394 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.609359 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.709692 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.709634 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.810100 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.810048 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:15.910521 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:15.910490 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.011384 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.011329 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.111854 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.111819 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.212421 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.212391 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.312923 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.312863 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.413948 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.413897 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.514718 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.514695 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.615564 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.615535 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.716330 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.716290 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.816665 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.816628 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:16.917171 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:16.917107 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.017869 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.017838 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.024006 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.023983 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-144.ec2.internal\": node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.118332 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.118292 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.219373 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.219302 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.319835 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.319803 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.420414 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.420378 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.521450 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.521396 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.621868 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.621838 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.722598 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.722563 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.822919 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.822893 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:17.923478 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:17.923439 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.024337 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.024298 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.125422 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.125358 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.225839 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.225813 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.326460 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.326420 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.427007 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.426925 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.527736 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.527703 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.628057 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.628023 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.729167 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.729103 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.829538 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.829492 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:18.930131 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:18.930095 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.031259 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.031187 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.131667 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.131645 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.232628 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.232602 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.333102 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.333073 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.433692 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.433661 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.534584 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.534551 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.635216 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.635168 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.735650 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.735617 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.836252 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.836222 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:19.937163 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:19.937107 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.037687 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.037656 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.138665 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.138638 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.239166 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.239103 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.339687 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.339653 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.440245 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.440211 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.541168 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.541104 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.641891 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.641851 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.742559 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.742533 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.843433 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.843402 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:20.943634 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:20.943605 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.044304 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.044264 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.144770 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.144710 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.245225 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.245192 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.345702 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.345659 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.446354 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.446285 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.547193 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.547161 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.594549 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:21.594526 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:54:21.595698 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:21.595680 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:54:21.595796 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:21.595712 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:54:21.595796 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:21.595727 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:54:21.596045 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.596032 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:21.596097 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:21.596088 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:54:21.596230 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.596215 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:54:21.648061 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.648016 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.748704 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.748642 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.849008 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.848962 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:21.949660 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:21.949622 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:22.050726 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.050676 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:22.151181 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.151141 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:22.251682 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.251656 2568 kubelet_node_status.go:515] "Error getting the current node from lister"
err="node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:54:22.352396 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.352367 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:54:22.453114 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.453075 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:54:22.554175 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.554141 2568 kubelet_node_status.go:509] "Node not becoming ready in time after startup" Apr 23 17:54:22.558392 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.558379 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:54:22.573807 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:22.573773 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:27.245311 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:27.245262 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-144.ec2.internal\": node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:54:27.574719 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:27.574691 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:32.559044 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:32.558990 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found" Apr 23 17:54:32.575216 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:32.575184 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Apr 23 17:54:34.595557 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:34.595512 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:54:34.599120 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:34.599100 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:54:34.599205 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:34.599139 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:54:34.599205 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:34.599150 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:54:34.599377 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:34.599365 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:34.599423 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:34.599414 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:54:34.599549 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:34.599535 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:54:37.419026 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:37.418972 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-144.ec2.internal\": node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:37.575968 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:37.575915 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:38.147197 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:38.147171 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:54:42.559710 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:42.559673 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:42.577012 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:42.576983 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:46.594856 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:46.594809 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:54:46.595976 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:46.595958 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:54:46.595976 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:46.595988 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:54:46.596118 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:46.595998 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:54:46.596191 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:46.596178 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:47.334653 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:47.334623 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:54:47.578059 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:47.578010 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:47.582227 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:47.582211 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-144.ec2.internal\": node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:47.595285 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:47.595244 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:54:47.596011 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:47.595993 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:54:47.596089 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:47.596024 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:54:47.596089 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:47.596034 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:54:47.596261 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:47.596247 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-144.ec2.internal\" not found" node="ip-10-0-131-144.ec2.internal"
Apr 23 17:54:47.596305 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:47.596296 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:54:47.596424 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:47.596411 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:54:52.560696 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:52.560659 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-144.ec2.internal\" not found"
Apr 23 17:54:52.578925 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:52.578893 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:56.600692 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:56.600656 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:54:56.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:56.693570 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal"
Apr 23 17:54:56.713488 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:56.713465 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:54:56.714137 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:56.714125 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal"
Apr 23 17:54:56.809564 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:56.809527 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:54:57.551551 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.551503 2568 apiserver.go:52] "Watching apiserver"
Apr 23 17:54:57.557184 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.557160 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 17:54:57.557509 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.557486 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8vtxf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc","openshift-cluster-node-tuning-operator/tuned-rtkdm","openshift-dns/node-resolver-r69nr","openshift-multus/multus-additional-cni-plugins-f8xkt","openshift-multus/multus-kljn6","openshift-multus/network-metrics-daemon-2dh7j","openshift-network-diagnostics/network-check-target-7vcgr","kube-system/konnectivity-agent-2sr6m","kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal","openshift-image-registry/node-ca-lw625","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal","openshift-network-operator/iptables-alerter-wfj88","openshift-ovn-kubernetes/ovnkube-node-lwrk7"]
Apr 23 17:54:57.560228 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.560211 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j"
Apr 23 17:54:57.560300 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.560284 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2"
Apr 23 17:54:57.561165 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.561152 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wfj88"
Apr 23 17:54:57.562263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.562239 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.564855 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.564831 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r69nr"
Apr 23 17:54:57.565971 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.565953 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.566073 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.566047 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.567136 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.567121 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2sr6m"
Apr 23 17:54:57.568027 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.568002 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.568132 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.568005 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.568371 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.568356 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.569229 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.569215 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r6lhx\""
Apr 23 17:54:57.570018 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.569539 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.570923 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.570909 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lw625"
Apr 23 17:54:57.571554 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.571540 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.572197 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.572180 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr"
Apr 23 17:54:57.572282 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.572261 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455"
Apr 23 17:54:57.573375 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573357 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.573462 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573407 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf"
Apr 23 17:54:57.573462 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573423 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8"
Apr 23 17:54:57.573569 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.573477 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96"
Apr 23 17:54:57.573569 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.573528 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318"
Apr 23 17:54:57.573689 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573666 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.573689 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573672 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.573787 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573694 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 17:54:57.574015 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573947 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:54:57.574015 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.573994 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2758j\""
Apr 23 17:54:57.574136 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.574077 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 17:54:57.574136 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.574081 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x8mkm\""
Apr 23 17:54:57.574136 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.574115 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.574278 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.574159 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 17:54:57.574352 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.574340 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 17:54:57.574447 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.574434 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.575163 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575138 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 17:54:57.575223 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575164 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.575223 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575196 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 17:54:57.575223 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575201 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4s597\""
Apr 23 17:54:57.575366 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575139 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:54:57.575366 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575272 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-z5mb4\""
Apr 23 17:54:57.575575 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575553 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 17:54:57.575755 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.575839 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575817 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xb587\""
Apr 23 17:54:57.575905 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575889 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.576003 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.575990 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.576133 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576120 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k7xsh\""
Apr 23 17:54:57.576173 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576153 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 17:54:57.576215 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576191 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 17:54:57.576255 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576215 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 17:54:57.576495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576358 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 17:54:57.576495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576366 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-btltb\""
Apr 23 17:54:57.576495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576372 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 17:54:57.576495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576395 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 17:54:57.576495 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.576398 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2vg5p\""
Apr 23 17:54:57.579505 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.579474 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:57.594203 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.594188 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 17:54:57.690865 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.690831 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgm6\" (UniqueName: \"kubernetes.io/projected/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-kube-api-access-jcgm6\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88"
Apr 23 17:54:57.690865 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.690866 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysctl-conf\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.690883 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovnkube-config\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.690943 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-cnibin\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.690971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-multus-certs\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.690988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691003 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48f03a68-806b-44d8-b107-0af334f9ad51-host\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691020 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-os-release\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691055 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-cni-multus\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-run\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691102 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38b64dbe-c6b9-427c-b563-fde192bde0bd-tmp\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691152 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-slash\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-cni-bin\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691237 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-kubelet\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691256 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysctl-d\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-host\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.691386 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691298 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-etc-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-os-release\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-lib-modules\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691359 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-tuned\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691382 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dd27\" (UniqueName: \"kubernetes.io/projected/c7659d0d-36b6-4e60-9b82-1870686d5685-kube-api-access-9dd27\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691423 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-run-netns\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-systemd\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f51b2065-4e39-46ce-8dda-c894907d1f96-kubelet-config\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-log-socket\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-cni-netd\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691544 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovnkube-script-lib\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-device-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-system-cni-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e172a861-bbd7-4d2d-9045-1bfe854b4621-cni-binary-copy\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691619 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-hostroot\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691641 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48f03a68-806b-44d8-b107-0af334f9ad51-serviceca\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-iptables-alerter-script\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88"
Apr 23 17:54:57.692123 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-host-slash\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysconfig\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-kubernetes\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691801 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-etc-selinux\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691827 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-netns\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691851 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-conf-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691885 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-env-overrides\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-registration-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.691992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffvkh\" (UniqueName: \"kubernetes.io/projected/8fbfbc17-c10b-4c74-bfbd-14a45c716246-kube-api-access-ffvkh\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cnibin\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692101 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692125 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-cni-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchsx\" (UniqueName: \"kubernetes.io/projected/e172a861-bbd7-4d2d-9045-1bfe854b4621-kube-api-access-qchsx\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.692691 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692171 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-var-lib-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-socket-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692209 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cni-binary-copy\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2sr8\" (UniqueName: \"kubernetes.io/projected/23f0cb8e-457e-471c-aba7-ab0e123e6f30-kube-api-access-z2sr8\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-k8s-cni-cncf-io\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6jv8\" (UniqueName: \"kubernetes.io/projected/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-kube-api-access-g6jv8\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692287 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-modprobe-d\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-systemd\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692327 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtdn\" (UniqueName: \"kubernetes.io/projected/38b64dbe-c6b9-427c-b563-fde192bde0bd-kube-api-access-cmtdn\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-system-cni-dir\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692379 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692394 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/41a9f941-4751-4a1c-b87e-c4384db8566d-konnectivity-ca\") pod \"konnectivity-agent-2sr6m\" (UID: \"41a9f941-4751-4a1c-b87e-c4384db8566d\") " pod="kube-system/konnectivity-agent-2sr6m"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692408 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f51b2065-4e39-46ce-8dda-c894907d1f96-dbus\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-cni-bin\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl59k\" (UniqueName: \"kubernetes.io/projected/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-kube-api-access-pl59k\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692457 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-socket-dir-parent\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692472 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/41a9f941-4751-4a1c-b87e-c4384db8566d-agent-certs\") pod \"konnectivity-agent-2sr6m\" (UID: \"41a9f941-4751-4a1c-b87e-c4384db8566d\") " pod="kube-system/konnectivity-agent-2sr6m"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692488 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-sys\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692505 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-systemd-units\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692523 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-ovn\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692547 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-node-log\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-sys-fs\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692597 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-daemon-config\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-etc-kubernetes\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-var-lib-kubelet\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c7659d0d-36b6-4e60-9b82-1870686d5685-tmp-dir\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692671 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovn-node-metrics-cert\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692687 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrb7n\" (UniqueName: \"kubernetes.io/projected/48f03a68-806b-44d8-b107-0af334f9ad51-kube-api-access-qrb7n\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c7659d0d-36b6-4e60-9b82-1870686d5685-hosts-file\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr"
Apr 23 17:54:57.693609 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.692722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-kubelet\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") "
pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.790163 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.790114 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-144.ec2.internal" podStartSLOduration=1.7900980880000001 podStartE2EDuration="1.790098088s" podCreationTimestamp="2026-04-23 17:54:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:54:57.764653087 +0000 UTC m=+155.684745563" watchObservedRunningTime="2026-04-23 17:54:57.790098088 +0000 UTC m=+155.710190545" Apr 23 17:54:57.793104 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793082 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovnkube-config\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.793194 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-cnibin\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.793194 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-multus-certs\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.793194 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793156 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:54:57.793194 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48f03a68-806b-44d8-b107-0af334f9ad51-host\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.793370 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-os-release\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.793370 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793225 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.793370 ip-10-0-131-144 kubenswrapper[2568]: I0423 
17:54:57.793248 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-cnibin\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.793370 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-multus-certs\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.793370 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793250 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-cni-multus\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.793370 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793323 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-os-release\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.793387 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793497 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-cni-multus\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.793527 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:58.293493678 +0000 UTC m=+156.213586205 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793577 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48f03a68-806b-44d8-b107-0af334f9ad51-host\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793582 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-run\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38b64dbe-c6b9-427c-b563-fde192bde0bd-tmp\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.793642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-run\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-slash\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793685 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-cni-bin\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793709 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-kubelet\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.793716 2568 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysctl-d\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793760 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-host\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.793773 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:58.293757014 +0000 UTC m=+156.213849468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793739 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-etc-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793832 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-os-release\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-host\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-lib-modules\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793878 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-slash\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-tuned\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-cni-bin\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dd27\" (UniqueName: \"kubernetes.io/projected/c7659d0d-36b6-4e60-9b82-1870686d5685-kube-api-access-9dd27\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.794055 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-run-netns\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793966 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysctl-d\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793981 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-os-release\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-systemd\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794010 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-etc-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794019 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f51b2065-4e39-46ce-8dda-c894907d1f96-kubelet-config\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-systemd\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-log-socket\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794064 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-cni-netd\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794089 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovnkube-script-lib\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-lib-modules\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.793806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-var-lib-kubelet\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794106 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f51b2065-4e39-46ce-8dda-c894907d1f96-kubelet-config\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-device-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" 
(UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794065 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-run-netns\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-system-cni-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-cni-netd\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794148 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-log-socket\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.794828 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794180 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-device-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e172a861-bbd7-4d2d-9045-1bfe854b4621-cni-binary-copy\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-hostroot\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48f03a68-806b-44d8-b107-0af334f9ad51-serviceca\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-system-cni-dir\") pod \"multus-kljn6\" (UID: 
\"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794307 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-hostroot\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794348 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-iptables-alerter-script\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794378 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovnkube-config\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794372 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794453 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-host-slash\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysconfig\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-kubernetes\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovnkube-script-lib\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysconfig\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-kubernetes\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794669 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-host-slash\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-etc-selinux\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-netns\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.795656 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-conf-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794760 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e172a861-bbd7-4d2d-9045-1bfe854b4621-cni-binary-copy\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794772 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-env-overrides\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794808 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-registration-dir\") pod 
\"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794843 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-etc-selinux\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794855 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-iptables-alerter-script\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffvkh\" (UniqueName: \"kubernetes.io/projected/8fbfbc17-c10b-4c74-bfbd-14a45c716246-kube-api-access-ffvkh\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48f03a68-806b-44d8-b107-0af334f9ad51-serviceca\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cnibin\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-netns\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-conf-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794957 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-registration-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.794978 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cnibin\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.796464 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795030 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-cni-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qchsx\" (UniqueName: \"kubernetes.io/projected/e172a861-bbd7-4d2d-9045-1bfe854b4621-kube-api-access-qchsx\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795124 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwrk7\" (UID: 
\"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795132 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-env-overrides\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795136 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-var-lib-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795173 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-cni-dir\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795178 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-var-lib-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795190 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-socket-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cni-binary-copy\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795253 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2sr8\" (UniqueName: \"kubernetes.io/projected/23f0cb8e-457e-471c-aba7-ab0e123e6f30-kube-api-access-z2sr8\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-k8s-cni-cncf-io\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795305 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g6jv8\" (UniqueName: \"kubernetes.io/projected/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-kube-api-access-g6jv8\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795313 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-socket-dir\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795332 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-modprobe-d\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-systemd\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.797263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795366 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-host-run-k8s-cni-cncf-io\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtdn\" (UniqueName: \"kubernetes.io/projected/38b64dbe-c6b9-427c-b563-fde192bde0bd-kube-api-access-cmtdn\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795424 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-system-cni-dir\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " 
pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-modprobe-d\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795462 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795508 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/41a9f941-4751-4a1c-b87e-c4384db8566d-konnectivity-ca\") pod \"konnectivity-agent-2sr6m\" (UID: \"41a9f941-4751-4a1c-b87e-c4384db8566d\") " pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f51b2065-4e39-46ce-8dda-c894907d1f96-dbus\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795578 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-cni-bin\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl59k\" (UniqueName: \"kubernetes.io/projected/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-kube-api-access-pl59k\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-openvswitch\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/23f0cb8e-457e-471c-aba7-ab0e123e6f30-system-cni-dir\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23f0cb8e-457e-471c-aba7-ab0e123e6f30-cni-binary-copy\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f51b2065-4e39-46ce-8dda-c894907d1f96-dbus\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-systemd\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.798084 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-cni-bin\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-socket-dir-parent\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/41a9f941-4751-4a1c-b87e-c4384db8566d-agent-certs\") pod \"konnectivity-agent-2sr6m\" (UID: \"41a9f941-4751-4a1c-b87e-c4384db8566d\") " pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795829 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-socket-dir-parent\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-sys\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-systemd-units\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795966 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-ovn\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-node-log\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.795994 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-sys-fs\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-sys\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796017 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-run-ovn\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-node-log\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-daemon-config\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796086 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-etc-kubernetes\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-var-lib-kubelet\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c7659d0d-36b6-4e60-9b82-1870686d5685-tmp-dir\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796141 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e172a861-bbd7-4d2d-9045-1bfe854b4621-etc-kubernetes\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.798721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796156 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovn-node-metrics-cert\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8fbfbc17-c10b-4c74-bfbd-14a45c716246-sys-fs\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796224 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrb7n\" (UniqueName: \"kubernetes.io/projected/48f03a68-806b-44d8-b107-0af334f9ad51-kube-api-access-qrb7n\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796257 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-var-lib-kubelet\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796271 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c7659d0d-36b6-4e60-9b82-1870686d5685-hosts-file\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-systemd-units\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-kubelet\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796368 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-host-kubelet\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgm6\" (UniqueName: \"kubernetes.io/projected/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-kube-api-access-jcgm6\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysctl-conf\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796674 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-sysctl-conf\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c7659d0d-36b6-4e60-9b82-1870686d5685-hosts-file\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.796975 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c7659d0d-36b6-4e60-9b82-1870686d5685-tmp-dir\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.797006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/41a9f941-4751-4a1c-b87e-c4384db8566d-konnectivity-ca\") pod \"konnectivity-agent-2sr6m\" (UID: \"41a9f941-4751-4a1c-b87e-c4384db8566d\") " pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.797088 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e172a861-bbd7-4d2d-9045-1bfe854b4621-multus-daemon-config\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.798574 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38b64dbe-c6b9-427c-b563-fde192bde0bd-tmp\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.798595 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38b64dbe-c6b9-427c-b563-fde192bde0bd-etc-tuned\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.798733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-ovn-node-metrics-cert\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.799225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.798796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/41a9f941-4751-4a1c-b87e-c4384db8566d-agent-certs\") pod \"konnectivity-agent-2sr6m\" (UID: \"41a9f941-4751-4a1c-b87e-c4384db8566d\") " pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:54:57.804072 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.804030 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:54:57.804072 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.804048 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:54:57.804072 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.804057 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:57.804221 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:57.804108 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:58.304095174 +0000 UTC m=+156.224187631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:57.806274 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.806246 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffvkh\" (UniqueName: \"kubernetes.io/projected/8fbfbc17-c10b-4c74-bfbd-14a45c716246-kube-api-access-ffvkh\") pod \"aws-ebs-csi-driver-node-rhfpc\" (UID: \"8fbfbc17-c10b-4c74-bfbd-14a45c716246\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.807344 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.807327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtdn\" (UniqueName: \"kubernetes.io/projected/38b64dbe-c6b9-427c-b563-fde192bde0bd-kube-api-access-cmtdn\") pod \"tuned-rtkdm\" (UID: \"38b64dbe-c6b9-427c-b563-fde192bde0bd\") " pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.809626 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.809587 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchsx\" (UniqueName: \"kubernetes.io/projected/e172a861-bbd7-4d2d-9045-1bfe854b4621-kube-api-access-qchsx\") pod \"multus-kljn6\" (UID: \"e172a861-bbd7-4d2d-9045-1bfe854b4621\") " pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.810391 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.810373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl59k\" (UniqueName: \"kubernetes.io/projected/52f4d4c2-438f-4486-8ffb-4afbaad14bf1-kube-api-access-pl59k\") pod \"ovnkube-node-lwrk7\" (UID: \"52f4d4c2-438f-4486-8ffb-4afbaad14bf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.810467 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.810391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2sr8\" (UniqueName: \"kubernetes.io/projected/23f0cb8e-457e-471c-aba7-ab0e123e6f30-kube-api-access-z2sr8\") pod \"multus-additional-cni-plugins-f8xkt\" (UID: \"23f0cb8e-457e-471c-aba7-ab0e123e6f30\") " pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.810686 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.810657 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6jv8\" (UniqueName: \"kubernetes.io/projected/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-kube-api-access-g6jv8\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:54:57.810791 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.810773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrb7n\" (UniqueName: \"kubernetes.io/projected/48f03a68-806b-44d8-b107-0af334f9ad51-kube-api-access-qrb7n\") pod \"node-ca-lw625\" (UID: \"48f03a68-806b-44d8-b107-0af334f9ad51\") " pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.811739 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.811720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dd27\" (UniqueName: 
\"kubernetes.io/projected/c7659d0d-36b6-4e60-9b82-1870686d5685-kube-api-access-9dd27\") pod \"node-resolver-r69nr\" (UID: \"c7659d0d-36b6-4e60-9b82-1870686d5685\") " pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.812277 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.812259 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgm6\" (UniqueName: \"kubernetes.io/projected/6f8d8a15-5978-4526-a3ff-4be016bdd0d7-kube-api-access-jcgm6\") pod \"iptables-alerter-wfj88\" (UID: \"6f8d8a15-5978-4526-a3ff-4be016bdd0d7\") " pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.873846 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.873813 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wfj88" Apr 23 17:54:57.878542 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.878523 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" Apr 23 17:54:57.880677 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.880652 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8d8a15_5978_4526_a3ff_4be016bdd0d7.slice/crio-938b60988ff68a70c5a931769ee356d07f5fc9ef1372213498ea5ed5b616ba1d WatchSource:0}: Error finding container 938b60988ff68a70c5a931769ee356d07f5fc9ef1372213498ea5ed5b616ba1d: Status 404 returned error can't find the container with id 938b60988ff68a70c5a931769ee356d07f5fc9ef1372213498ea5ed5b616ba1d Apr 23 17:54:57.884387 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.884367 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r69nr" Apr 23 17:54:57.886476 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.886454 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b64dbe_c6b9_427c_b563_fde192bde0bd.slice/crio-87ee0f87e53dfc66eaa8fca5b2998d9808f916e7daac0dc351e64848b2667598 WatchSource:0}: Error finding container 87ee0f87e53dfc66eaa8fca5b2998d9808f916e7daac0dc351e64848b2667598: Status 404 returned error can't find the container with id 87ee0f87e53dfc66eaa8fca5b2998d9808f916e7daac0dc351e64848b2667598 Apr 23 17:54:57.888877 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.888861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" Apr 23 17:54:57.890743 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.890723 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7659d0d_36b6_4e60_9b82_1870686d5685.slice/crio-af221ae84e0c5a12ec4491fc5fc64cd5dd290196fe6ee9cd038a2db8d15c5417 WatchSource:0}: Error finding container af221ae84e0c5a12ec4491fc5fc64cd5dd290196fe6ee9cd038a2db8d15c5417: Status 404 returned error can't find the container with id af221ae84e0c5a12ec4491fc5fc64cd5dd290196fe6ee9cd038a2db8d15c5417 Apr 23 17:54:57.894392 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.894374 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kljn6" Apr 23 17:54:57.895559 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.895535 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f0cb8e_457e_471c_aba7_ab0e123e6f30.slice/crio-c39da8a908dd188148b3e68a54946e1edcaa27e3e388065e2b770c1127dcfebd WatchSource:0}: Error finding container c39da8a908dd188148b3e68a54946e1edcaa27e3e388065e2b770c1127dcfebd: Status 404 returned error can't find the container with id c39da8a908dd188148b3e68a54946e1edcaa27e3e388065e2b770c1127dcfebd Apr 23 17:54:57.899092 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.899071 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:54:57.900354 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.900329 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode172a861_bbd7_4d2d_9045_1bfe854b4621.slice/crio-50c78fc28cc76ad9eac4ef1e9c5d0a50d5be2fd5a3938c4d9c6d0a8a4a1b6af5 WatchSource:0}: Error finding container 50c78fc28cc76ad9eac4ef1e9c5d0a50d5be2fd5a3938c4d9c6d0a8a4a1b6af5: Status 404 returned error can't find the container with id 50c78fc28cc76ad9eac4ef1e9c5d0a50d5be2fd5a3938c4d9c6d0a8a4a1b6af5 Apr 23 17:54:57.904044 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.904027 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:54:57.905039 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.905015 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a9f941_4751_4a1c_b87e_c4384db8566d.slice/crio-024d15aa22bddbd3eb9bb3b4c0e552904b14bba89c3cc6e41f650671cc908a81 WatchSource:0}: Error finding container 024d15aa22bddbd3eb9bb3b4c0e552904b14bba89c3cc6e41f650671cc908a81: Status 404 returned error can't find the container with id 024d15aa22bddbd3eb9bb3b4c0e552904b14bba89c3cc6e41f650671cc908a81 Apr 23 17:54:57.909739 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.909672 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" Apr 23 17:54:57.910875 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.910854 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f4d4c2_438f_4486_8ffb_4afbaad14bf1.slice/crio-f0a7695d604746b93ea02dd7c842b319de2b36e6b7341cd24df45bda1a07162c WatchSource:0}: Error finding container f0a7695d604746b93ea02dd7c842b319de2b36e6b7341cd24df45bda1a07162c: Status 404 returned error can't find the container with id f0a7695d604746b93ea02dd7c842b319de2b36e6b7341cd24df45bda1a07162c Apr 23 17:54:57.913165 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:57.913150 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lw625" Apr 23 17:54:57.918249 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.918230 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbfbc17_c10b_4c74_bfbd_14a45c716246.slice/crio-aa31f00b1ac2b06e13a7fc681e50ac8e8d1256a68fe141d931627aa4359d173d WatchSource:0}: Error finding container aa31f00b1ac2b06e13a7fc681e50ac8e8d1256a68fe141d931627aa4359d173d: Status 404 returned error can't find the container with id aa31f00b1ac2b06e13a7fc681e50ac8e8d1256a68fe141d931627aa4359d173d Apr 23 17:54:57.922183 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:54:57.922161 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f03a68_806b_44d8_b107_0af334f9ad51.slice/crio-441bf4380859d868bab3f94e80d59d74484b3eb0e8a90160645f0e017405391a WatchSource:0}: Error finding container 441bf4380859d868bab3f94e80d59d74484b3eb0e8a90160645f0e017405391a: Status 404 returned error can't find the container with id 441bf4380859d868bab3f94e80d59d74484b3eb0e8a90160645f0e017405391a Apr 23 17:54:58.298987 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.298949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:54:58.299165 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.299005 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:58.299165 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.299139 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:54:58.299295 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.299202 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.299182623 +0000 UTC m=+157.219275076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:54:58.299692 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.299673 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:58.299779 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.299731 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:54:59.299715253 +0000 UTC m=+157.219807706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:58.400536 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.399862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:54:58.400536 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.400057 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:54:58.400536 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.400076 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:54:58.400536 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.400089 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:58.400536 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:58.400158 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.400139384 +0000 UTC m=+157.320231839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:58.845607 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.845535 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" event={"ID":"8fbfbc17-c10b-4c74-bfbd-14a45c716246","Type":"ContainerStarted","Data":"aa31f00b1ac2b06e13a7fc681e50ac8e8d1256a68fe141d931627aa4359d173d"} Apr 23 17:54:58.860967 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.860906 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"f0a7695d604746b93ea02dd7c842b319de2b36e6b7341cd24df45bda1a07162c"} Apr 23 17:54:58.871477 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.871445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2sr6m" event={"ID":"41a9f941-4751-4a1c-b87e-c4384db8566d","Type":"ContainerStarted","Data":"024d15aa22bddbd3eb9bb3b4c0e552904b14bba89c3cc6e41f650671cc908a81"} Apr 23 17:54:58.875960 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.875600 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kljn6" event={"ID":"e172a861-bbd7-4d2d-9045-1bfe854b4621","Type":"ContainerStarted","Data":"50c78fc28cc76ad9eac4ef1e9c5d0a50d5be2fd5a3938c4d9c6d0a8a4a1b6af5"} Apr 23 17:54:58.878349 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.878267 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerStarted","Data":"c39da8a908dd188148b3e68a54946e1edcaa27e3e388065e2b770c1127dcfebd"} Apr 23 17:54:58.887790 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.887741 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wfj88" event={"ID":"6f8d8a15-5978-4526-a3ff-4be016bdd0d7","Type":"ContainerStarted","Data":"938b60988ff68a70c5a931769ee356d07f5fc9ef1372213498ea5ed5b616ba1d"} Apr 23 17:54:58.895493 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.895435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lw625" event={"ID":"48f03a68-806b-44d8-b107-0af334f9ad51","Type":"ContainerStarted","Data":"441bf4380859d868bab3f94e80d59d74484b3eb0e8a90160645f0e017405391a"} Apr 23 17:54:58.907706 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.907681 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r69nr" event={"ID":"c7659d0d-36b6-4e60-9b82-1870686d5685","Type":"ContainerStarted","Data":"af221ae84e0c5a12ec4491fc5fc64cd5dd290196fe6ee9cd038a2db8d15c5417"} Apr 23 17:54:58.913552 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:58.913493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" event={"ID":"38b64dbe-c6b9-427c-b563-fde192bde0bd","Type":"ContainerStarted","Data":"87ee0f87e53dfc66eaa8fca5b2998d9808f916e7daac0dc351e64848b2667598"} Apr 23 17:54:59.306504 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:59.305636 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:59.306504 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:59.305740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:54:59.306504 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.305858 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:59.306504 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.305918 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:01.3059008 +0000 UTC m=+159.225993259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:59.306504 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.306338 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:54:59.306504 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.306402 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:01.306387235 +0000 UTC m=+159.226479689 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:54:59.406426 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:59.406385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:54:59.406622 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.406573 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:54:59.406622 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.406600 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:54:59.406622 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.406616 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:59.406780 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.406677 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:01.406657832 +0000 UTC m=+159.326750291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:59.596189 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:59.595355 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:54:59.596189 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.595493 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:54:59.596189 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:59.595887 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:54:59.596189 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.595986 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:54:59.596189 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:54:59.596069 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:54:59.596189 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:54:59.596143 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:01.321687 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:01.321645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:01.322219 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:01.321704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:01.322219 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.321832 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:01.322219 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.321892 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:05.321874305 +0000 UTC m=+163.241966770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:01.322219 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.322029 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:01.322219 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.322095 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:55:05.32207568 +0000 UTC m=+163.242168149 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:01.422217 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:01.422176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:01.422407 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.422362 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:01.422407 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.422391 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:01.422520 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.422423 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:01.422520 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.422484 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:05.422464668 +0000 UTC m=+163.342557130 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:01.595642 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:01.595549 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:01.595815 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.595720 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:01.595815 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:01.595777 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:01.596009 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:01.595786 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:01.596009 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.595884 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:01.596009 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:01.595975 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:02.579925 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:02.579841 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:03.595235 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:03.595165 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:03.595679 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:03.595288 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:03.595679 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:03.595291 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:03.595679 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:03.595326 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:03.595679 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:03.595396 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:03.595679 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:03.595459 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:05.354415 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:05.354370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:05.354885 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:05.354426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:05.354885 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.354555 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:05.354885 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.354555 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:05.354885 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.354643 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:13.354621058 +0000 UTC m=+171.274713517 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:05.354885 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.354663 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:13.354652602 +0000 UTC m=+171.274745062 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:05.456172 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:05.455531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:05.456172 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.455709 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:05.456172 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.455729 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:05.456172 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.455742 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:05.456172 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.455830 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:13.455811786 +0000 UTC m=+171.375904251 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:05.595017 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:05.594920 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:05.595207 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.595065 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:05.595207 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:05.595163 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:05.595326 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.595263 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:05.595326 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:05.595316 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:05.595428 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:05.595401 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:07.581097 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:07.581062 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:07.595344 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:07.595311 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:07.595479 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:07.595342 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:07.595479 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:07.595456 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:07.595600 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:07.595516 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:07.595600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:07.595535 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:07.595698 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:07.595652 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:09.594975 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:09.594923 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:09.595415 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:09.595004 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:09.595415 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:09.595119 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:09.595415 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:09.595184 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:09.595415 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:09.595282 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:09.595415 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:09.595364 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:10.597888 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:10.597850 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:10.598352 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:10.598005 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:10.598421 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:10.598366 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8" Apr 23 17:55:10.598550 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:10.598530 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_openshift-machine-config-operator(8dc4456d0c7e39a0756b16fde0e83318)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podUID="8dc4456d0c7e39a0756b16fde0e83318" Apr 23 17:55:11.595229 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:11.595174 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:11.595439 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:11.595175 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:11.595439 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:11.595312 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:11.595439 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:11.595411 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:12.581477 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:12.581429 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:12.600767 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:12.600737 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:12.600944 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:12.600857 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:13.412411 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:13.412368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:13.412411 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:13.412424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:13.412682 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.412530 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:13.412682 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.412622 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:29.412604478 +0000 UTC m=+187.332696944 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:13.412682 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.412539 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:13.412809 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.412710 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:29.412693709 +0000 UTC m=+187.332786167 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:13.513527 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:13.513480 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:13.513743 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.513678 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:13.513743 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.513704 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:13.513743 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.513716 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:13.513903 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.513808 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:29.513794026 +0000 UTC m=+187.433886479 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:13.594893 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:13.594853 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:13.595341 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:13.594853 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:13.595341 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.595006 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:13.595341 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:13.595113 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:14.597225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.596993 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:14.597862 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:14.597353 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:14.945902 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.945824 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" event={"ID":"8fbfbc17-c10b-4c74-bfbd-14a45c716246","Type":"ContainerStarted","Data":"5f9f13a5b041c99a858760ade46b90c30078c042d2f856d0e8af633a64bf7d51"} Apr 23 17:55:14.949879 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.949496 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 17:55:14.950069 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.950027 2568 generic.go:358] "Generic (PLEG): container finished" podID="52f4d4c2-438f-4486-8ffb-4afbaad14bf1" containerID="83a9793cd7227f81d01087ce145ea050e137ff3e474bb8e510e24ab2b5b86781" exitCode=1 Apr 23 17:55:14.950225 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.950157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"887b5445f850fd06f31c3558525b17c65eab324fdcf57ac611c5464c392b96dc"} Apr 23 17:55:14.950306 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.950293 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"7405fec2fa250008a56860c5356acaae5a90e7838a29884d430ba9b804353757"} Apr 23 17:55:14.950360 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.950309 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"d2d7f70db775340c898779e3070019a46a6aa6386abf0b86f55a0e7329b0022a"} Apr 23 17:55:14.950360 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.950323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"c247667b1f875fab8848a6cff6a9a7fa4a901e10181a951d35dd159d3ec1461f"} Apr 23 17:55:14.950360 ip-10-0-131-144 
kubenswrapper[2568]: I0423 17:55:14.950337 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerDied","Data":"83a9793cd7227f81d01087ce145ea050e137ff3e474bb8e510e24ab2b5b86781"} Apr 23 17:55:14.950360 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.950352 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"4463f8bf781742de61bc400fa40d5e23f8e6046dfb44c6ab41c93abd8250555e"} Apr 23 17:55:14.954632 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.954599 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2sr6m" event={"ID":"41a9f941-4751-4a1c-b87e-c4384db8566d","Type":"ContainerStarted","Data":"344b9b52475570b990a43de81cd5a80d1836cab154c5dbb274fae3f154778647"} Apr 23 17:55:14.955912 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.955887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kljn6" event={"ID":"e172a861-bbd7-4d2d-9045-1bfe854b4621","Type":"ContainerStarted","Data":"2d41a38a69b010b20cf7a10d3e1e5594e6ffa738664ebf5bace7f326eb090259"} Apr 23 17:55:14.957334 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.957310 2568 generic.go:358] "Generic (PLEG): container finished" podID="23f0cb8e-457e-471c-aba7-ab0e123e6f30" containerID="4cdd4cdeb0e50522d294e70c82e491d5560a2fcde74b0144c9c6af6e0853de17" exitCode=0 Apr 23 17:55:14.957490 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.957389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerDied","Data":"4cdd4cdeb0e50522d294e70c82e491d5560a2fcde74b0144c9c6af6e0853de17"} Apr 23 17:55:14.958765 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.958686 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lw625" event={"ID":"48f03a68-806b-44d8-b107-0af334f9ad51","Type":"ContainerStarted","Data":"71616bb043341925c69d13c1202c7fd00d118b881d0d07aa257e8a14f12d843e"} Apr 23 17:55:14.960122 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.960099 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r69nr" event={"ID":"c7659d0d-36b6-4e60-9b82-1870686d5685","Type":"ContainerStarted","Data":"c11e34cafd6ec25014b265b4cd2b68330ad0e0fe8c4517ecc3997b2e1a0f483a"} Apr 23 17:55:14.961358 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.961334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" event={"ID":"38b64dbe-c6b9-427c-b563-fde192bde0bd","Type":"ContainerStarted","Data":"f39881c8f6df863e059465104661c2c9fdee41fa4d9ad976a568592d06636e34"} Apr 23 17:55:14.971793 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.971745 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2sr6m" podStartSLOduration=56.89369711 podStartE2EDuration="1m8.971728352s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.906831175 +0000 UTC m=+155.826923639" lastFinishedPulling="2026-04-23 17:55:09.984862424 +0000 UTC m=+167.904954881" observedRunningTime="2026-04-23 17:55:14.971014826 +0000 UTC m=+172.891107302" watchObservedRunningTime="2026-04-23 17:55:14.971728352 +0000 UTC 
m=+172.891821211" Apr 23 17:55:14.988071 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:14.987986 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r69nr" podStartSLOduration=52.605342353 podStartE2EDuration="1m8.987966397s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.892651034 +0000 UTC m=+155.812743487" lastFinishedPulling="2026-04-23 17:55:14.275275076 +0000 UTC m=+172.195367531" observedRunningTime="2026-04-23 17:55:14.9872699 +0000 UTC m=+172.907362376" watchObservedRunningTime="2026-04-23 17:55:14.987966397 +0000 UTC m=+172.908058873" Apr 23 17:55:15.026253 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.025661 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rtkdm" podStartSLOduration=52.624176006 podStartE2EDuration="1m9.025645627s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.888087576 +0000 UTC m=+155.808180032" lastFinishedPulling="2026-04-23 17:55:14.289557185 +0000 UTC m=+172.209649653" observedRunningTime="2026-04-23 17:55:15.003658931 +0000 UTC m=+172.923751408" watchObservedRunningTime="2026-04-23 17:55:15.025645627 +0000 UTC m=+172.945738104" Apr 23 17:55:15.043926 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.043822 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kljn6" podStartSLOduration=52.581834068 podStartE2EDuration="1m9.043801839s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.90180173 +0000 UTC m=+155.821894187" lastFinishedPulling="2026-04-23 17:55:14.3637695 +0000 UTC m=+172.283861958" observedRunningTime="2026-04-23 17:55:15.043721023 +0000 UTC m=+172.963813497" watchObservedRunningTime="2026-04-23 17:55:15.043801839 +0000 UTC m=+172.963894316" Apr 23 17:55:15.061785 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.061718 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lw625" podStartSLOduration=52.710412517 podStartE2EDuration="1m9.061694433s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.923952838 +0000 UTC m=+155.844045296" lastFinishedPulling="2026-04-23 17:55:14.275234744 +0000 UTC m=+172.195327212" observedRunningTime="2026-04-23 17:55:15.06105308 +0000 UTC m=+172.981145557" watchObservedRunningTime="2026-04-23 17:55:15.061694433 +0000 UTC m=+172.981786909" Apr 23 17:55:15.400301 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.400088 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:55:15.594654 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.594623 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:15.594832 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.594625 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:15.594832 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:15.594747 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:15.594960 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:15.594827 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:15.647075 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.646951 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:55:15.400291441Z","UUID":"5385f251-d3f9-472a-9553-e9036f4b9eea","Handler":null,"Name":"","Endpoint":""} Apr 23 17:55:15.650110 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.650087 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:55:15.650250 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.650122 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:55:15.964697 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.964524 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wfj88" event={"ID":"6f8d8a15-5978-4526-a3ff-4be016bdd0d7","Type":"ContainerStarted","Data":"12f0c4ddcda7ab48e63bf461cdedec7c886ae7bcc37f758d475533eea0c8f4dc"} Apr 23 17:55:15.967044 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.967008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" event={"ID":"8fbfbc17-c10b-4c74-bfbd-14a45c716246","Type":"ContainerStarted","Data":"aff39a919bc641b56e23ba507a271bbead750e970b0f214647fecd8424adf0a3"} Apr 23 17:55:15.987567 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:15.987501 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wfj88" podStartSLOduration=53.594218555 podStartE2EDuration="1m9.987485191s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.881966681 +0000 UTC m=+155.802059137" lastFinishedPulling="2026-04-23 17:55:14.275233305 +0000 UTC m=+172.195325773" observedRunningTime="2026-04-23 17:55:15.987000066 +0000 UTC m=+173.907092542" watchObservedRunningTime="2026-04-23 17:55:15.987485191 +0000 UTC m=+173.907577667" Apr 23 17:55:16.595693 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:16.595659 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:16.595893 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:16.595814 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:16.971406 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:16.971366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" event={"ID":"8fbfbc17-c10b-4c74-bfbd-14a45c716246","Type":"ContainerStarted","Data":"6620ccf118a5311fff45e26755b129ad503bd1bd9dc6b3cfc7eeb1a60f4392a2"} Apr 23 17:55:17.001718 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.001658 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rhfpc" podStartSLOduration=52.675241293 podStartE2EDuration="1m11.001637826s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.919899293 +0000 UTC m=+155.839991746" lastFinishedPulling="2026-04-23 17:55:16.246295813 +0000 UTC m=+174.166388279" observedRunningTime="2026-04-23 17:55:17.00157197 +0000 UTC m=+174.921664447" watchObservedRunningTime="2026-04-23 17:55:17.001637826 +0000 UTC m=+174.921730306" Apr 23 17:55:17.582381 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:17.582339 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:17.594623 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.594585 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:17.594799 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.594585 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:17.594799 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:17.594703 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:17.594880 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:17.594805 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:17.899522 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.899443 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:55:17.900182 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.900152 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:55:17.976214 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.976186 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 17:55:17.976658 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.976519 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"3b06bb1c626cab1c75dbfd3ec58f59de81fa226e0dcd4aadf10e7d61c742908f"} Apr 23 17:55:17.976832 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.976811 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:55:17.977339 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:17.977321 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2sr6m" Apr 23 17:55:18.594912 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:18.594878 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:18.595118 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:18.595045 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:19.595376 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.595194 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:19.595873 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.595236 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:19.595873 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:19.595486 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:19.595873 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:19.595522 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:19.982868 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.982797 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 17:55:19.983151 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.983127 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"5b77993836cfd516927a473acf46053997201c3a8765d7bbc7e11989b78a4455"} Apr 23 17:55:19.983496 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.983472 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:55:19.983636 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.983616 2568 scope.go:117] "RemoveContainer" containerID="83a9793cd7227f81d01087ce145ea050e137ff3e474bb8e510e24ab2b5b86781" Apr 23 17:55:19.984801 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.984780 2568 generic.go:358] "Generic (PLEG): container finished" podID="23f0cb8e-457e-471c-aba7-ab0e123e6f30" containerID="b8399dfe5f928858904d7b078ee008a5503ef4ef162b2922b93b35a2186ed16a" exitCode=0 Apr 23 17:55:19.984892 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.984863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerDied","Data":"b8399dfe5f928858904d7b078ee008a5503ef4ef162b2922b93b35a2186ed16a"} Apr 23 17:55:19.998425 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:19.998400 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:55:20.595630 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:20.595594 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:20.596332 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:20.595742 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:20.989215 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:20.989140 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 17:55:20.989479 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:20.989458 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" event={"ID":"52f4d4c2-438f-4486-8ffb-4afbaad14bf1","Type":"ContainerStarted","Data":"4671168bdd88a12aade2a2cac8a6073a6a3946aa5e0b27caa09c88a3ef2a4ec3"} Apr 23 17:55:20.989859 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:20.989839 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:55:20.989859 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:20.989864 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:55:21.003518 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:21.003495 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:55:21.026113 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:21.026070 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" podStartSLOduration=58.497851791 podStartE2EDuration="1m15.026057649s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.914035531 +0000 UTC m=+155.834127999" lastFinishedPulling="2026-04-23 17:55:14.442241387 +0000 UTC m=+172.362333857" observedRunningTime="2026-04-23 17:55:21.025291625 +0000 UTC m=+178.945384100" watchObservedRunningTime="2026-04-23 17:55:21.026057649 +0000 UTC m=+178.946150123" Apr 23 17:55:21.595164 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:21.595130 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:21.595336 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:21.595175 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:21.595336 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:21.595251 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:21.595425 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:21.595390 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:21.992757 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:21.992668 2568 generic.go:358] "Generic (PLEG): container finished" podID="23f0cb8e-457e-471c-aba7-ab0e123e6f30" containerID="6b6bd1ae4c25f49487b155b1fe87ceba6cb3881d0058ecc556dc0aca847c5c0e" exitCode=0 Apr 23 17:55:21.993228 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:21.992751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerDied","Data":"6b6bd1ae4c25f49487b155b1fe87ceba6cb3881d0058ecc556dc0aca847c5c0e"} Apr 23 17:55:22.583023 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:22.582978 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:22.595939 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:22.595886 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:22.596109 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:22.596055 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:22.596376 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:22.596359 2568 scope.go:117] "RemoveContainer" containerID="7d22b5c3984b993dd6a97d4fcb92664ad0e56c30d7171e1edb4321b83ebb2ab8" Apr 23 17:55:22.995967 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:22.995778 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 17:55:22.996489 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:22.996357 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" event={"ID":"8dc4456d0c7e39a0756b16fde0e83318","Type":"ContainerStarted","Data":"a08e8d1dd295229105e0f4dfa7f802d0e5ffae6341cea1b0f515ef331642f28d"} Apr 23 17:55:23.594829 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.594800 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:23.595008 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.594801 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:23.595008 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:23.594907 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:23.595100 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:23.595027 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:23.599797 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.599736 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal" podStartSLOduration=27.599718419 podStartE2EDuration="27.599718419s" podCreationTimestamp="2026-04-23 17:54:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:23.03589445 +0000 UTC m=+180.955986926" watchObservedRunningTime="2026-04-23 17:55:23.599718419 +0000 UTC m=+181.519810895" Apr 23 17:55:23.600454 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.600429 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8vtxf"] Apr 23 17:55:23.601380 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.601342 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2dh7j"] Apr 23 17:55:23.601493 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.601480 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:23.601726 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:23.601620 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:23.602106 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:23.602074 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7vcgr"] Apr 23 17:55:24.002028 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:24.001996 2568 generic.go:358] "Generic (PLEG): container finished" podID="23f0cb8e-457e-471c-aba7-ab0e123e6f30" containerID="6bbe9a8fa7405484f3d356348f091d6a5787e472800690ec688818c47192b50f" exitCode=0 Apr 23 17:55:24.002598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:24.002078 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:24.002598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:24.002072 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerDied","Data":"6bbe9a8fa7405484f3d356348f091d6a5787e472800690ec688818c47192b50f"} Apr 23 17:55:24.002598 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:24.002343 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:24.002598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:24.002360 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:24.002598 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:24.002474 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:25.595059 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:25.595024 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:25.595518 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:25.595134 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:25.595518 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:25.595149 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:25.595518 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:25.595261 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:25.595518 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:25.595314 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:25.595518 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:25.595395 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:27.584169 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:27.584127 2568 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:27.595391 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:27.595352 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:27.595391 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:27.595396 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:27.595630 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:27.595487 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:27.595630 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:27.595489 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:27.595630 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:27.595543 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:27.595630 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:27.595611 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:29.433508 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:29.433311 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:29.433993 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:29.433522 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:29.433993 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.433444 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:29.433993 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.433622 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:29.433993 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.433665 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs podName:9fe17f32-d37d-48dc-b42b-74bbc44d26d2 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:01.433640751 +0000 UTC m=+219.353733211 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs") pod "network-metrics-daemon-2dh7j" (UID: "9fe17f32-d37d-48dc-b42b-74bbc44d26d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:29.433993 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.433689 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret podName:f51b2065-4e39-46ce-8dda-c894907d1f96 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:01.433677913 +0000 UTC m=+219.353770367 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret") pod "global-pull-secret-syncer-8vtxf" (UID: "f51b2065-4e39-46ce-8dda-c894907d1f96") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:55:29.534230 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:29.534192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:29.534403 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.534360 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:29.534403 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.534383 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:29.534403 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.534393 2568 projected.go:194] Error preparing data for projected volume kube-api-access-mr7qv for pod openshift-network-diagnostics/network-check-target-7vcgr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:29.534512 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.534442 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv podName:705ed263-d57f-4121-a61c-7fca69acb455 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:01.534429602 +0000 UTC m=+219.454522058 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mr7qv" (UniqueName: "kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv") pod "network-check-target-7vcgr" (UID: "705ed263-d57f-4121-a61c-7fca69acb455") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:29.595057 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:29.595016 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:29.595057 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:29.595042 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:29.595057 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:29.595016 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:29.595305 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.595136 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455"
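
This is the third failed MountVolume.SetUp pass for the same volumes, and the retry delay doubles each time: durationBeforeRetry was 8s at 17:55:05, 16s at 17:55:13, and is now 32s, the exponential backoff applied by nestedpendingoperations. The underlying error is unchanged: the kubelet serves pod secrets and configMaps from per-namespace watch caches, and a lookup fails with "not registered" until the object has been registered for a pod on the node and its cache populated (compare the "Caches populated" reflector entries further below, logged as node-exporter-64qgx is admitted). A toy model of that lookup behaviour, with hypothetical names, assuming only the register-then-read semantics visible in the log:

```go
// Toy model of the kubelet's cache-backed object lookup: a secret resolves
// only after it has been registered; before that the lookup fails like the
// "not registered" errors above. Names and structure are hypothetical.
package main

import "fmt"

type objectCache struct {
	registered map[string]bool // "namespace/name" -> watch registered
}

func (c *objectCache) Register(namespace, name string) {
	c.registered[namespace+"/"+name] = true
}

func (c *objectCache) GetSecret(namespace, name string) error {
	if !c.registered[namespace+"/"+name] {
		return fmt.Errorf("object %q/%q not registered", namespace, name)
	}
	return nil // a real cache would return the secret here
}

func main() {
	c := &objectCache{registered: map[string]bool{}}

	// Before registration: the failure mode seen above for
	// kube-system/original-pull-secret.
	fmt.Println(c.GetSecret("kube-system", "original-pull-secret"))

	// After the kubelet registers the pod's objects and the cache fills
	// (compare the later "Caches populated" entries), the lookup succeeds.
	c.Register("kube-system", "original-pull-secret")
	fmt.Println(c.GetSecret("kube-system", "original-pull-secret"))
}
```
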
pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:29.595305 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.595179 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:29.595305 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:29.595242 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:31.018612 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:31.018578 2568 generic.go:358] "Generic (PLEG): container finished" podID="23f0cb8e-457e-471c-aba7-ab0e123e6f30" containerID="86565d460e3b732d563b33aa1aae8ff65d2feb1ca71d1f138e9a21b73634c523" exitCode=0 Apr 23 17:55:31.019085 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:31.018643 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerDied","Data":"86565d460e3b732d563b33aa1aae8ff65d2feb1ca71d1f138e9a21b73634c523"} Apr 23 17:55:31.595135 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:31.595099 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:31.595135 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:31.595136 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:31.595360 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:31.595099 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:31.595360 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:31.595257 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dh7j" podUID="9fe17f32-d37d-48dc-b42b-74bbc44d26d2" Apr 23 17:55:31.595427 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:31.595373 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8vtxf" podUID="f51b2065-4e39-46ce-8dda-c894907d1f96" Apr 23 17:55:31.595488 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:31.595426 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7vcgr" podUID="705ed263-d57f-4121-a61c-7fca69acb455" Apr 23 17:55:32.022532 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:32.022448 2568 generic.go:358] "Generic (PLEG): container finished" podID="23f0cb8e-457e-471c-aba7-ab0e123e6f30" containerID="279e5100f053364cda11429de6159204dc7cbd1b1b2ea990869d8a7385fcaf77" exitCode=0 Apr 23 17:55:32.022532 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:32.022494 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerDied","Data":"279e5100f053364cda11429de6159204dc7cbd1b1b2ea990869d8a7385fcaf77"} Apr 23 17:55:33.027762 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.027730 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" event={"ID":"23f0cb8e-457e-471c-aba7-ab0e123e6f30","Type":"ContainerStarted","Data":"366b7689e03c843d354e7967c209bede055f2f3c4565946c3d448636bfa0252a"} Apr 23 17:55:33.061722 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.061663 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f8xkt" podStartSLOduration=55.005391283 podStartE2EDuration="1m27.061649321s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:57.897522672 +0000 UTC m=+155.817615128" lastFinishedPulling="2026-04-23 17:55:29.953780714 +0000 UTC m=+187.873873166" observedRunningTime="2026-04-23 17:55:33.058251474 +0000 UTC m=+190.978343953" watchObservedRunningTime="2026-04-23 17:55:33.061649321 +0000 UTC m=+190.981741795" Apr 23 17:55:33.446017 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.445980 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-64qgx"] Apr 23 17:55:33.469189 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.469161 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.471317 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.471286 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:55:33.471478 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.471368 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 17:55:33.471478 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.471369 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 17:55:33.471478 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.471418 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 17:55:33.471836 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.471818 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:55:33.471912 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.471843 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:55:33.472549 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.472533 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rzdrn\"" Apr 23 17:55:33.563020 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.562980 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-accelerators-collector-config\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563055 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-metrics-client-ca\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563075 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-sys\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7nzf\" (UniqueName: \"kubernetes.io/projected/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-kube-api-access-t7nzf\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563112 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-textfile\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563199 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563355 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563194 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-root\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563355 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-wtmp\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.563355 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.563249 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-tls\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.595146 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.595115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:55:33.595248 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.595153 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:55:33.595248 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.595153 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:55:33.597287 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.597264 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5487j\"" Apr 23 17:55:33.597423 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.597397 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:55:33.597659 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.597643 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 17:55:33.597730 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.597674 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:55:33.597776 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.597728 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gszvz\"" Apr 23 17:55:33.597776 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.597755 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:55:33.663920 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.663886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-textfile\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.663920 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.663921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664173 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.663977 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-root\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664173 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.663999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-wtmp\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664173 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664058 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-root\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664173 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-tls\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664378 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:33.664191 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 17:55:33.664378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-accelerators-collector-config\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-wtmp\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664378 ip-10-0-131-144 kubenswrapper[2568]: E0423 17:55:33.664249 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-tls podName:4e2e3a37-968e-4bc2-87c5-ca830cebbc44 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:34.164233791 +0000 UTC m=+192.084326245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-tls") pod "node-exporter-64qgx" (UID: "4e2e3a37-968e-4bc2-87c5-ca830cebbc44") : secret "node-exporter-tls" not found Apr 23 17:55:33.664378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664334 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-metrics-client-ca\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664378 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-sys\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664680 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664402 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7nzf\" (UniqueName: \"kubernetes.io/projected/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-kube-api-access-t7nzf\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664680 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-sys\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 
17:55:33.664818 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-accelerators-collector-config\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.664883 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.664867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-metrics-client-ca\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.668159 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.668138 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.675171 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.675141 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-textfile\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:33.675480 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:33.675458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7nzf\" (UniqueName: \"kubernetes.io/projected/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-kube-api-access-t7nzf\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:34.167416 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:34.167377 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-tls\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:34.169729 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:34.169704 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e2e3a37-968e-4bc2-87c5-ca830cebbc44-node-exporter-tls\") pod \"node-exporter-64qgx\" (UID: \"4e2e3a37-968e-4bc2-87c5-ca830cebbc44\") " pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:34.378708 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:34.378671 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-64qgx" Apr 23 17:55:34.386374 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:55:34.386339 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2e3a37_968e_4bc2_87c5_ca830cebbc44.slice/crio-2273e9802d4d86c5811cacc30e953168dcfe306f5c6db67465edeae713effe9d WatchSource:0}: Error finding container 2273e9802d4d86c5811cacc30e953168dcfe306f5c6db67465edeae713effe9d: Status 404 returned error can't find the container with id 2273e9802d4d86c5811cacc30e953168dcfe306f5c6db67465edeae713effe9d Apr 23 17:55:35.033209 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:35.033168 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64qgx" event={"ID":"4e2e3a37-968e-4bc2-87c5-ca830cebbc44","Type":"ContainerStarted","Data":"2273e9802d4d86c5811cacc30e953168dcfe306f5c6db67465edeae713effe9d"} Apr 23 17:55:36.036555 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:36.036522 2568 generic.go:358] "Generic (PLEG): container finished" podID="4e2e3a37-968e-4bc2-87c5-ca830cebbc44" containerID="5d2d779542d270f3dbd42f601e34d47984825897c43923d0590c65d89581e8af" exitCode=0 Apr 23 17:55:36.037034 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:36.036595 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64qgx" event={"ID":"4e2e3a37-968e-4bc2-87c5-ca830cebbc44","Type":"ContainerDied","Data":"5d2d779542d270f3dbd42f601e34d47984825897c43923d0590c65d89581e8af"} Apr 23 17:55:37.040912 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:37.040876 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64qgx" event={"ID":"4e2e3a37-968e-4bc2-87c5-ca830cebbc44","Type":"ContainerStarted","Data":"33f82b8ef4004ec6b4788343bc4a9546bfee99dc594b7dd517573af53824d41c"} Apr 23 17:55:37.040912 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:37.040911 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64qgx" event={"ID":"4e2e3a37-968e-4bc2-87c5-ca830cebbc44","Type":"ContainerStarted","Data":"d58bc3a36400b4f9c3073fc476ee6aca09b02b4aa6d61fabafc79e98a6aa1fe7"} Apr 23 17:55:37.066234 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:37.066173 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-64qgx" podStartSLOduration=3.298314244 podStartE2EDuration="4.06615608s" podCreationTimestamp="2026-04-23 17:55:33 +0000 UTC" firstStartedPulling="2026-04-23 17:55:34.388187481 +0000 UTC m=+192.308279937" lastFinishedPulling="2026-04-23 17:55:35.156029308 +0000 UTC m=+193.076121773" observedRunningTime="2026-04-23 17:55:37.065152902 +0000 UTC m=+194.985245379" watchObservedRunningTime="2026-04-23 17:55:37.06615608 +0000 UTC m=+194.986248533" Apr 23 17:55:38.991843 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:38.991812 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-144.ec2.internal" event="NodeReady" Apr 23 17:55:39.049505 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.049476 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w9st4"] Apr 23 17:55:39.052351 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.052335 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.053669 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.053647 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xmnns"] Apr 23 17:55:39.055478 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.055460 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:55:39.055745 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.055727 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:55:39.055943 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.055917 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-54wff\"" Apr 23 17:55:39.056371 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.056355 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.060276 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.060261 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:55:39.060602 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.060587 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-f4mbl\"" Apr 23 17:55:39.060687 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.060668 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:55:39.061286 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.061272 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:55:39.068193 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.068174 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w9st4"] Apr 23 17:55:39.097325 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.097271 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmnns"] Apr 23 17:55:39.098623 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.098596 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xjc8h"] Apr 23 17:55:39.101661 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.101638 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.101788 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.101759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-tmp-dir\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.101919 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.101895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swm4n\" (UniqueName: \"kubernetes.io/projected/d8390fd8-fb61-4d17-af7d-e393c2393937-kube-api-access-swm4n\") pod \"ingress-canary-xmnns\" (UID: \"d8390fd8-fb61-4d17-af7d-e393c2393937\") " pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.102017 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.101957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjxr\" (UniqueName: \"kubernetes.io/projected/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-kube-api-access-wgjxr\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.102098 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.102076 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-config-volume\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.102156 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.102111 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-metrics-tls\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.102156 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.102132 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8390fd8-fb61-4d17-af7d-e393c2393937-cert\") pod \"ingress-canary-xmnns\" (UID: \"d8390fd8-fb61-4d17-af7d-e393c2393937\") " pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.104746 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.104717 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 17:55:39.104862 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.104809 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 17:55:39.104862 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.104840 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 17:55:39.105110 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.105094 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b7fp4\"" Apr 23 17:55:39.105272 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.105253 2568 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 17:55:39.118395 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.118374 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xjc8h"] Apr 23 17:55:39.203196 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-tmp-dir\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.203387 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swm4n\" (UniqueName: \"kubernetes.io/projected/d8390fd8-fb61-4d17-af7d-e393c2393937-kube-api-access-swm4n\") pod \"ingress-canary-xmnns\" (UID: \"d8390fd8-fb61-4d17-af7d-e393c2393937\") " pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.203387 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgjxr\" (UniqueName: \"kubernetes.io/projected/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-kube-api-access-wgjxr\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.203387 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5n8v\" (UniqueName: \"kubernetes.io/projected/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-kube-api-access-f5n8v\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.203387 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203304 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-data-volume\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.203387 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-crio-socket\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.203575 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.203575 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-config-volume\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.203575 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203477 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-metrics-tls\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.203575 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8390fd8-fb61-4d17-af7d-e393c2393937-cert\") pod \"ingress-canary-xmnns\" (UID: \"d8390fd8-fb61-4d17-af7d-e393c2393937\") " pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.203575 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203519 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.203815 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-tmp-dir\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.204000 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.203983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-config-volume\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.205694 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.205670 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-metrics-tls\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.205850 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.205832 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8390fd8-fb61-4d17-af7d-e393c2393937-cert\") pod \"ingress-canary-xmnns\" (UID: \"d8390fd8-fb61-4d17-af7d-e393c2393937\") " pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.218363 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.218336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swm4n\" (UniqueName: \"kubernetes.io/projected/d8390fd8-fb61-4d17-af7d-e393c2393937-kube-api-access-swm4n\") pod \"ingress-canary-xmnns\" (UID: \"d8390fd8-fb61-4d17-af7d-e393c2393937\") " pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.219863 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.219840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjxr\" (UniqueName: 
\"kubernetes.io/projected/cf0eb35a-032a-4b56-a1d6-f08cffb1de45-kube-api-access-wgjxr\") pod \"dns-default-w9st4\" (UID: \"cf0eb35a-032a-4b56-a1d6-f08cffb1de45\") " pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.304164 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304076 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304309 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5n8v\" (UniqueName: \"kubernetes.io/projected/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-kube-api-access-f5n8v\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304309 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-data-volume\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304309 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-crio-socket\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304309 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304480 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-crio-socket\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304572 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304552 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-data-volume\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.304736 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.304720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " 
pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.306560 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.306544 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.318383 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.318350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5n8v\" (UniqueName: \"kubernetes.io/projected/66fc82e8-4b83-46d1-beec-4b668ecd1eeb-kube-api-access-f5n8v\") pod \"insights-runtime-extractor-xjc8h\" (UID: \"66fc82e8-4b83-46d1-beec-4b668ecd1eeb\") " pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.362455 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.362413 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:39.370273 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.370245 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmnns" Apr 23 17:55:39.410457 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.410389 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xjc8h" Apr 23 17:55:39.570888 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.570808 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xjc8h"] Apr 23 17:55:39.574565 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:55:39.574533 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66fc82e8_4b83_46d1_beec_4b668ecd1eeb.slice/crio-7c2b5995657cf9481b4338d6743518b21c511b4725c2280457c38cbd350a8cd3 WatchSource:0}: Error finding container 7c2b5995657cf9481b4338d6743518b21c511b4725c2280457c38cbd350a8cd3: Status 404 returned error can't find the container with id 7c2b5995657cf9481b4338d6743518b21c511b4725c2280457c38cbd350a8cd3 Apr 23 17:55:39.736097 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.736063 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmnns"] Apr 23 17:55:39.738693 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.738668 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w9st4"] Apr 23 17:55:39.739797 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:55:39.739772 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8390fd8_fb61_4d17_af7d_e393c2393937.slice/crio-f085a5af98095e4bb62d913f82fb3441eccba8243fdd928bf92034d003642afa WatchSource:0}: Error finding container f085a5af98095e4bb62d913f82fb3441eccba8243fdd928bf92034d003642afa: Status 404 returned error can't find the container with id f085a5af98095e4bb62d913f82fb3441eccba8243fdd928bf92034d003642afa Apr 23 17:55:39.742037 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:55:39.742011 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0eb35a_032a_4b56_a1d6_f08cffb1de45.slice/crio-c3455891f68508d35f627c7327331ec879919cd16b3d7b2de314b33e95834237 WatchSource:0}: Error finding container c3455891f68508d35f627c7327331ec879919cd16b3d7b2de314b33e95834237: Status 404 returned error can't find the container with id c3455891f68508d35f627c7327331ec879919cd16b3d7b2de314b33e95834237 Apr 23 17:55:39.845503 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.845474 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:55:39.848997 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.848980 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.851850 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.851830 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 17:55:39.851986 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.851909 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 17:55:39.852046 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852006 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 17:55:39.852354 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852326 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 17:55:39.852354 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852346 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 17:55:39.852354 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852328 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 17:55:39.852556 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852329 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 17:55:39.852556 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852374 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 17:55:39.852721 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852705 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1g4g554ad5e6j\"" Apr 23 17:55:39.852838 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852821 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 17:55:39.852838 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852831 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 17:55:39.852978 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852906 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 17:55:39.852978 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.852952 2568 reflector.go:430] "Caches populated" 
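[Annotation] Each reflector.go "Caches populated" line marks a client-go reflector finishing its initial List for one secret or configmap that a pod on this node references; only then can the corresponding volume be built, which is why the earlier "not registered" mount errors clear once these lines appear. The same machinery used directly, as a minimal sketch assuming an in-cluster config with RBAC to list/watch secrets in openshift-monitoring:

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

// Minimal client-go example of the reflector/cache mechanism behind
// the "Caches populated" lines: start a secret informer for one
// namespace and block until its initial sync completes.
func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 30*time.Second, informers.WithNamespace("openshift-monitoring"))
	inf := factory.Core().V1().Secrets().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	if !cache.WaitForCacheSync(stop, inf.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("caches populated") // analogous to the reflector.go:430 log line
}
```

The kubelet scopes these reflectors per referenced object (note the object-"namespace"/"name" reflector names) rather than per namespace, but the sync semantics are the same.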
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jbg8q\"" Apr 23 17:55:39.853144 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.853132 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 17:55:39.856114 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.856091 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 17:55:39.871718 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.871690 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:55:39.909926 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.909887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-config-out\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.909926 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.909925 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910135 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.909964 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910135 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910038 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-web-config\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910135 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910135 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910120 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910266 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910159 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910266 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-config\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910266 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910266 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910266 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910237 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910297 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9k5\" (UniqueName: \"kubernetes.io/projected/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-kube-api-access-hb9k5\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910360 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910419 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:39.910434 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:39.910435 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011131 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011131 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9k5\" (UniqueName: \"kubernetes.io/projected/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-kube-api-access-hb9k5\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011236 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-config-out\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011498 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.011600 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011636 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-web-config\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011778 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-config\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.011891 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.012692 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.012584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.014364 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.014103 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-config-out\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.014364 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.014299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015330 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.014649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015330 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015330 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015289 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015544 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015544 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015520 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015889 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015995 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.015995 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.015961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-web-config\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.016118 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.016016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.016309 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.016286 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.016590 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.016573 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-config\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.017003 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.016983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.017598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.017565 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.022033 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.022010 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9k5\" (UniqueName: \"kubernetes.io/projected/51d006ec-4b09-46fb-9905-6d8cdcb0a4ff-kube-api-access-hb9k5\") pod \"prometheus-k8s-0\" (UID: \"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.048603 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.048570 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xjc8h" event={"ID":"66fc82e8-4b83-46d1-beec-4b668ecd1eeb","Type":"ContainerStarted","Data":"e6e8fe7a3286ad9d89cb34323508985100dc1531918262c72b588e1d80f85c5d"} Apr 23 17:55:40.048603 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.048607 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xjc8h" event={"ID":"66fc82e8-4b83-46d1-beec-4b668ecd1eeb","Type":"ContainerStarted","Data":"7c2b5995657cf9481b4338d6743518b21c511b4725c2280457c38cbd350a8cd3"} Apr 23 17:55:40.049549 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.049517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w9st4" event={"ID":"cf0eb35a-032a-4b56-a1d6-f08cffb1de45","Type":"ContainerStarted","Data":"c3455891f68508d35f627c7327331ec879919cd16b3d7b2de314b33e95834237"} Apr 23 17:55:40.050370 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.050350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmnns" event={"ID":"d8390fd8-fb61-4d17-af7d-e393c2393937","Type":"ContainerStarted","Data":"f085a5af98095e4bb62d913f82fb3441eccba8243fdd928bf92034d003642afa"} Apr 23 17:55:40.158424 ip-10-0-131-144 
kubenswrapper[2568]: I0423 17:55:40.158252 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:40.334346 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:40.334282 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:55:40.398634 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:55:40.398594 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51d006ec_4b09_46fb_9905_6d8cdcb0a4ff.slice/crio-b8ae3a4b331907667ed2876adc91851c9796fad9d6e13ebc2ab4720d55789334 WatchSource:0}: Error finding container b8ae3a4b331907667ed2876adc91851c9796fad9d6e13ebc2ab4720d55789334: Status 404 returned error can't find the container with id b8ae3a4b331907667ed2876adc91851c9796fad9d6e13ebc2ab4720d55789334 Apr 23 17:55:41.056486 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:41.056450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"b8ae3a4b331907667ed2876adc91851c9796fad9d6e13ebc2ab4720d55789334"} Apr 23 17:55:41.058162 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:41.058134 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xjc8h" event={"ID":"66fc82e8-4b83-46d1-beec-4b668ecd1eeb","Type":"ContainerStarted","Data":"d5488a60c8bbbb27b87d135501414e625c6bdc9c01b5d544393a70ede1e50a8e"} Apr 23 17:55:43.065423 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.065382 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmnns" event={"ID":"d8390fd8-fb61-4d17-af7d-e393c2393937","Type":"ContainerStarted","Data":"27640fb7163b703c701a061111084500ff20b946fc788ee37a32b21931420eb1"} Apr 23 17:55:43.066806 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.066784 2568 generic.go:358] "Generic (PLEG): container finished" podID="51d006ec-4b09-46fb-9905-6d8cdcb0a4ff" containerID="9e7e12b2668eb7aba3762c55cebe724a9bbdcb0fce2e0e2b04be221e623ec159" exitCode=0 Apr 23 17:55:43.066907 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.066868 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerDied","Data":"9e7e12b2668eb7aba3762c55cebe724a9bbdcb0fce2e0e2b04be221e623ec159"} Apr 23 17:55:43.068769 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.068742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xjc8h" event={"ID":"66fc82e8-4b83-46d1-beec-4b668ecd1eeb","Type":"ContainerStarted","Data":"f2ad29fd4996d78ed23c3834d72fbb05580ff1f4b69388f9776dc4be7f961de4"} Apr 23 17:55:43.070157 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.070129 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w9st4" event={"ID":"cf0eb35a-032a-4b56-a1d6-f08cffb1de45","Type":"ContainerStarted","Data":"e4ddb3b8e62f4d78f5b7e7dd16ed8b9ded3811fb7d32167f98c0cf0dfc21a5f7"} Apr 23 17:55:43.070224 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.070158 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w9st4" event={"ID":"cf0eb35a-032a-4b56-a1d6-f08cffb1de45","Type":"ContainerStarted","Data":"490b7f1521c484b27c91416cdd7c9aff281d288d8f1ad08e95c180673357eb6e"} Apr 23 17:55:43.070289 
ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.070265 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w9st4" Apr 23 17:55:43.089278 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.087118 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xmnns" podStartSLOduration=1.512369423 podStartE2EDuration="4.087097243s" podCreationTimestamp="2026-04-23 17:55:39 +0000 UTC" firstStartedPulling="2026-04-23 17:55:39.741770656 +0000 UTC m=+197.661863122" lastFinishedPulling="2026-04-23 17:55:42.316498475 +0000 UTC m=+200.236590942" observedRunningTime="2026-04-23 17:55:43.084663549 +0000 UTC m=+201.004756024" watchObservedRunningTime="2026-04-23 17:55:43.087097243 +0000 UTC m=+201.007189718" Apr 23 17:55:43.116948 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.116871 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w9st4" podStartSLOduration=1.549337237 podStartE2EDuration="4.116855258s" podCreationTimestamp="2026-04-23 17:55:39 +0000 UTC" firstStartedPulling="2026-04-23 17:55:39.743917166 +0000 UTC m=+197.664009618" lastFinishedPulling="2026-04-23 17:55:42.311435169 +0000 UTC m=+200.231527639" observedRunningTime="2026-04-23 17:55:43.115818315 +0000 UTC m=+201.035910789" watchObservedRunningTime="2026-04-23 17:55:43.116855258 +0000 UTC m=+201.036947788" Apr 23 17:55:43.163024 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:43.162969 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xjc8h" podStartSLOduration=1.459720351 podStartE2EDuration="4.162954274s" podCreationTimestamp="2026-04-23 17:55:39 +0000 UTC" firstStartedPulling="2026-04-23 17:55:39.635036287 +0000 UTC m=+197.555128740" lastFinishedPulling="2026-04-23 17:55:42.33827021 +0000 UTC m=+200.258362663" observedRunningTime="2026-04-23 17:55:43.161844417 +0000 UTC m=+201.081936892" watchObservedRunningTime="2026-04-23 17:55:43.162954274 +0000 UTC m=+201.083046749" Apr 23 17:55:46.084442 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:46.084410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"a9ab0e85efb75736b2e3daeb34c322c7d88a5415ef012e6507871a78e1cf0f08"} Apr 23 17:55:46.084442 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:46.084445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"a3220799276c6f4112d42d8045782bd9e92ba957728514b992bc2e26b5e1808d"} Apr 23 17:55:48.091904 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:48.091869 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"f2b10a9b6dba7a167dc89320c82c3baa8451845f18fef4029dbc1f314b286d2d"} Apr 23 17:55:48.091904 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:48.091909 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"0bf9aa3c807ba84729690e1427ddf912107e93999c5854fbcb0795a117cb7b99"} Apr 23 17:55:49.096889 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:49.096853 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"d1e9fdb8b055b08a112b87f25821ac006f27433b3ac03a49c269260a762c4bfa"} Apr 23 17:55:49.096889 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:49.096889 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51d006ec-4b09-46fb-9905-6d8cdcb0a4ff","Type":"ContainerStarted","Data":"f0675911190cbb968629061de48aed8b828a3c9b63e89599b6352f4d2d14aaef"} Apr 23 17:55:49.135358 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:49.135302 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.705709563 podStartE2EDuration="10.13528759s" podCreationTimestamp="2026-04-23 17:55:39 +0000 UTC" firstStartedPulling="2026-04-23 17:55:40.40222465 +0000 UTC m=+198.322317109" lastFinishedPulling="2026-04-23 17:55:47.831802684 +0000 UTC m=+205.751895136" observedRunningTime="2026-04-23 17:55:49.135197536 +0000 UTC m=+207.055290047" watchObservedRunningTime="2026-04-23 17:55:49.13528759 +0000 UTC m=+207.055380067" Apr 23 17:55:50.159033 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:50.158997 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:55:53.007406 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:53.007375 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwrk7" Apr 23 17:55:53.078644 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:55:53.078608 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w9st4" Apr 23 17:56:01.500606 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.500564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:56:01.501029 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.500626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:56:01.502950 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.502905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f51b2065-4e39-46ce-8dda-c894907d1f96-original-pull-secret\") pod \"global-pull-secret-syncer-8vtxf\" (UID: \"f51b2065-4e39-46ce-8dda-c894907d1f96\") " pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:56:01.503049 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.503037 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe17f32-d37d-48dc-b42b-74bbc44d26d2-metrics-certs\") pod \"network-metrics-daemon-2dh7j\" (UID: \"9fe17f32-d37d-48dc-b42b-74bbc44d26d2\") " pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:56:01.504997 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.504981 2568 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vtxf" Apr 23 17:56:01.514839 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.514816 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dh7j" Apr 23 17:56:01.601971 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.601916 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:56:01.604449 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.604423 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr7qv\" (UniqueName: \"kubernetes.io/projected/705ed263-d57f-4121-a61c-7fca69acb455-kube-api-access-mr7qv\") pod \"network-check-target-7vcgr\" (UID: \"705ed263-d57f-4121-a61c-7fca69acb455\") " pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:56:01.636101 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.636066 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8vtxf"] Apr 23 17:56:01.641773 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:56:01.641741 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf51b2065_4e39_46ce_8dda_c894907d1f96.slice/crio-b69892a1ae393f179d452b1b076690a89bc5bdb8ace63338465806d7a0785935 WatchSource:0}: Error finding container b69892a1ae393f179d452b1b076690a89bc5bdb8ace63338465806d7a0785935: Status 404 returned error can't find the container with id b69892a1ae393f179d452b1b076690a89bc5bdb8ace63338465806d7a0785935 Apr 23 17:56:01.659085 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.659057 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2dh7j"] Apr 23 17:56:01.662421 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:56:01.662391 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe17f32_d37d_48dc_b42b_74bbc44d26d2.slice/crio-6964b2633f5e5179fd1db7c620c33954185201b10e91d5ee228feb04faf69a62 WatchSource:0}: Error finding container 6964b2633f5e5179fd1db7c620c33954185201b10e91d5ee228feb04faf69a62: Status 404 returned error can't find the container with id 6964b2633f5e5179fd1db7c620c33954185201b10e91d5ee228feb04faf69a62 Apr 23 17:56:01.811422 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.811324 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:56:01.931911 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:01.931880 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7vcgr"] Apr 23 17:56:01.936723 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:56:01.936694 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705ed263_d57f_4121_a61c_7fca69acb455.slice/crio-f977ca3cbf97458ffc9fc42113afe5f70ab14c97ad462d8892d793f0e7dcda48 WatchSource:0}: Error finding container f977ca3cbf97458ffc9fc42113afe5f70ab14c97ad462d8892d793f0e7dcda48: Status 404 returned error can't find the container with id f977ca3cbf97458ffc9fc42113afe5f70ab14c97ad462d8892d793f0e7dcda48 Apr 23 17:56:02.135433 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:02.135375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7vcgr" event={"ID":"705ed263-d57f-4121-a61c-7fca69acb455","Type":"ContainerStarted","Data":"f977ca3cbf97458ffc9fc42113afe5f70ab14c97ad462d8892d793f0e7dcda48"} Apr 23 17:56:02.136629 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:02.136582 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8vtxf" event={"ID":"f51b2065-4e39-46ce-8dda-c894907d1f96","Type":"ContainerStarted","Data":"b69892a1ae393f179d452b1b076690a89bc5bdb8ace63338465806d7a0785935"} Apr 23 17:56:02.137776 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:02.137741 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2dh7j" event={"ID":"9fe17f32-d37d-48dc-b42b-74bbc44d26d2","Type":"ContainerStarted","Data":"6964b2633f5e5179fd1db7c620c33954185201b10e91d5ee228feb04faf69a62"} Apr 23 17:56:03.146358 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:03.146303 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2dh7j" event={"ID":"9fe17f32-d37d-48dc-b42b-74bbc44d26d2","Type":"ContainerStarted","Data":"439cff470ec4b1127caea6af6ad6106afe29fab06026697092e255740b2b83f5"} Apr 23 17:56:04.152813 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:04.152770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2dh7j" event={"ID":"9fe17f32-d37d-48dc-b42b-74bbc44d26d2","Type":"ContainerStarted","Data":"3a2627e023ac7941f25fb01e9de11887bd87a8cb33c89c42576eb8286a8c8f47"} Apr 23 17:56:07.163372 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:07.163335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7vcgr" event={"ID":"705ed263-d57f-4121-a61c-7fca69acb455","Type":"ContainerStarted","Data":"6013748649d68b8f5f651cab47f5ac0e2517286a26f27eb49a66889bd20c7f4e"} Apr 23 17:56:07.163815 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:07.163435 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:56:07.164662 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:07.164640 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8vtxf" event={"ID":"f51b2065-4e39-46ce-8dda-c894907d1f96","Type":"ContainerStarted","Data":"832e88548d9b23874c1e92a1bb9efbab8d0b6e916f68155bb82814c3955d46dd"} Apr 23 17:56:07.180700 ip-10-0-131-144 kubenswrapper[2568]: I0423 
17:56:07.180657 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2dh7j" podStartSLOduration=120.0066681 podStartE2EDuration="2m1.180642031s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:56:01.664208624 +0000 UTC m=+219.584301077" lastFinishedPulling="2026-04-23 17:56:02.838182552 +0000 UTC m=+220.758275008" observedRunningTime="2026-04-23 17:56:04.179262982 +0000 UTC m=+222.099355457" watchObservedRunningTime="2026-04-23 17:56:07.180642031 +0000 UTC m=+225.100734507" Apr 23 17:56:07.180860 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:07.180760 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7vcgr" podStartSLOduration=115.893446328 podStartE2EDuration="2m0.180753732s" podCreationTimestamp="2026-04-23 17:54:07 +0000 UTC" firstStartedPulling="2026-04-23 17:56:01.938534982 +0000 UTC m=+219.858627436" lastFinishedPulling="2026-04-23 17:56:06.225842384 +0000 UTC m=+224.145934840" observedRunningTime="2026-04-23 17:56:07.180159606 +0000 UTC m=+225.100252082" watchObservedRunningTime="2026-04-23 17:56:07.180753732 +0000 UTC m=+225.100846208" Apr 23 17:56:07.195335 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:07.195285 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8vtxf" podStartSLOduration=114.609111683 podStartE2EDuration="1m59.195271101s" podCreationTimestamp="2026-04-23 17:54:08 +0000 UTC" firstStartedPulling="2026-04-23 17:56:01.643448505 +0000 UTC m=+219.563540958" lastFinishedPulling="2026-04-23 17:56:06.22960792 +0000 UTC m=+224.149700376" observedRunningTime="2026-04-23 17:56:07.194563064 +0000 UTC m=+225.114655540" watchObservedRunningTime="2026-04-23 17:56:07.195271101 +0000 UTC m=+225.115363576" Apr 23 17:56:38.170198 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:38.170170 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7vcgr" Apr 23 17:56:40.159044 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:40.158994 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:40.177277 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:40.177248 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:40.267059 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:56:40.267033 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:22.514265 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:57:22.514240 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 17:57:22.515275 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:57:22.515254 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 17:57:22.518235 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:57:22.518215 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 17:57:22.518919 ip-10-0-131-144 
kubenswrapper[2568]: I0423 17:57:22.518900 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 17:59:17.127147 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.127103 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-bjccc"] Apr 23 17:59:17.132506 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.132469 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.137453 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.137430 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8fzpf\"" Apr 23 17:59:17.137815 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.137753 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 17:59:17.138263 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.138244 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 17:59:17.138366 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.138262 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 17:59:17.143376 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.143350 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-bjccc"] Apr 23 17:59:17.264658 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.264627 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5a16254d-c5e1-4567-8042-5cdab52e09a5-data\") pod \"seaweedfs-86cc847c5c-bjccc\" (UID: \"5a16254d-c5e1-4567-8042-5cdab52e09a5\") " pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.264844 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.264684 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwk5\" (UniqueName: \"kubernetes.io/projected/5a16254d-c5e1-4567-8042-5cdab52e09a5-kube-api-access-njwk5\") pod \"seaweedfs-86cc847c5c-bjccc\" (UID: \"5a16254d-c5e1-4567-8042-5cdab52e09a5\") " pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.365109 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.365063 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5a16254d-c5e1-4567-8042-5cdab52e09a5-data\") pod \"seaweedfs-86cc847c5c-bjccc\" (UID: \"5a16254d-c5e1-4567-8042-5cdab52e09a5\") " pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.365282 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.365129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njwk5\" (UniqueName: \"kubernetes.io/projected/5a16254d-c5e1-4567-8042-5cdab52e09a5-kube-api-access-njwk5\") pod \"seaweedfs-86cc847c5c-bjccc\" (UID: \"5a16254d-c5e1-4567-8042-5cdab52e09a5\") " pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.365511 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.365486 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5a16254d-c5e1-4567-8042-5cdab52e09a5-data\") pod \"seaweedfs-86cc847c5c-bjccc\" (UID: 
\"5a16254d-c5e1-4567-8042-5cdab52e09a5\") " pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.373980 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.373955 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwk5\" (UniqueName: \"kubernetes.io/projected/5a16254d-c5e1-4567-8042-5cdab52e09a5-kube-api-access-njwk5\") pod \"seaweedfs-86cc847c5c-bjccc\" (UID: \"5a16254d-c5e1-4567-8042-5cdab52e09a5\") " pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.441112 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.441012 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:17.560833 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.560801 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-bjccc"] Apr 23 17:59:17.563747 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:59:17.563716 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a16254d_c5e1_4567_8042_5cdab52e09a5.slice/crio-cb0071d92494f3fe57e0bcebfadba203d1eebd7451593c2fdd61b960ffcd900e WatchSource:0}: Error finding container cb0071d92494f3fe57e0bcebfadba203d1eebd7451593c2fdd61b960ffcd900e: Status 404 returned error can't find the container with id cb0071d92494f3fe57e0bcebfadba203d1eebd7451593c2fdd61b960ffcd900e Apr 23 17:59:17.565010 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.564993 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:59:17.642210 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:17.642173 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-bjccc" event={"ID":"5a16254d-c5e1-4567-8042-5cdab52e09a5","Type":"ContainerStarted","Data":"cb0071d92494f3fe57e0bcebfadba203d1eebd7451593c2fdd61b960ffcd900e"} Apr 23 17:59:20.651707 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:20.651670 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-bjccc" event={"ID":"5a16254d-c5e1-4567-8042-5cdab52e09a5","Type":"ContainerStarted","Data":"75b38dbc0f3e8ebfe72a6625502fcb8e60b3cfbd080bc4accd49f9b62ca4b678"} Apr 23 17:59:20.652132 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:20.651800 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:20.669754 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:20.669707 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-bjccc" podStartSLOduration=1.199897235 podStartE2EDuration="3.669693846s" podCreationTimestamp="2026-04-23 17:59:17 +0000 UTC" firstStartedPulling="2026-04-23 17:59:17.565118392 +0000 UTC m=+415.485210844" lastFinishedPulling="2026-04-23 17:59:20.034914993 +0000 UTC m=+417.955007455" observedRunningTime="2026-04-23 17:59:20.668245711 +0000 UTC m=+418.588338188" watchObservedRunningTime="2026-04-23 17:59:20.669693846 +0000 UTC m=+418.589786320" Apr 23 17:59:26.657649 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:26.657611 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-bjccc" Apr 23 17:59:52.196258 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.196223 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-gddln"] Apr 23 17:59:52.201399 ip-10-0-131-144 kubenswrapper[2568]: I0423 
17:59:52.201378 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.203486 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.203467 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-d4nd7\"" Apr 23 17:59:52.203598 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.203495 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 17:59:52.212324 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.212025 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-gddln"] Apr 23 17:59:52.307168 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.307136 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8vgz\" (UniqueName: \"kubernetes.io/projected/5b72518b-17cb-450f-a719-9ea96e0f11a3-kube-api-access-g8vgz\") pod \"kserve-controller-manager-6fc5d867c5-gddln\" (UID: \"5b72518b-17cb-450f-a719-9ea96e0f11a3\") " pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.307329 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.307177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b72518b-17cb-450f-a719-9ea96e0f11a3-cert\") pod \"kserve-controller-manager-6fc5d867c5-gddln\" (UID: \"5b72518b-17cb-450f-a719-9ea96e0f11a3\") " pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.407592 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.407552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8vgz\" (UniqueName: \"kubernetes.io/projected/5b72518b-17cb-450f-a719-9ea96e0f11a3-kube-api-access-g8vgz\") pod \"kserve-controller-manager-6fc5d867c5-gddln\" (UID: \"5b72518b-17cb-450f-a719-9ea96e0f11a3\") " pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.407739 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.407601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b72518b-17cb-450f-a719-9ea96e0f11a3-cert\") pod \"kserve-controller-manager-6fc5d867c5-gddln\" (UID: \"5b72518b-17cb-450f-a719-9ea96e0f11a3\") " pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.409997 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.409969 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b72518b-17cb-450f-a719-9ea96e0f11a3-cert\") pod \"kserve-controller-manager-6fc5d867c5-gddln\" (UID: \"5b72518b-17cb-450f-a719-9ea96e0f11a3\") " pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.416393 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.416374 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8vgz\" (UniqueName: \"kubernetes.io/projected/5b72518b-17cb-450f-a719-9ea96e0f11a3-kube-api-access-g8vgz\") pod \"kserve-controller-manager-6fc5d867c5-gddln\" (UID: \"5b72518b-17cb-450f-a719-9ea96e0f11a3\") " pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.513867 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.513784 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:52.632616 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.632582 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-gddln"] Apr 23 17:59:52.635656 ip-10-0-131-144 kubenswrapper[2568]: W0423 17:59:52.635629 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b72518b_17cb_450f_a719_9ea96e0f11a3.slice/crio-bffba284f8f23f23136d3d105afbca11e89189b1760178fa8535416fbed99940 WatchSource:0}: Error finding container bffba284f8f23f23136d3d105afbca11e89189b1760178fa8535416fbed99940: Status 404 returned error can't find the container with id bffba284f8f23f23136d3d105afbca11e89189b1760178fa8535416fbed99940 Apr 23 17:59:52.731706 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:52.731668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" event={"ID":"5b72518b-17cb-450f-a719-9ea96e0f11a3","Type":"ContainerStarted","Data":"bffba284f8f23f23136d3d105afbca11e89189b1760178fa8535416fbed99940"} Apr 23 17:59:55.741746 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:55.741711 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" event={"ID":"5b72518b-17cb-450f-a719-9ea96e0f11a3","Type":"ContainerStarted","Data":"089b4a1f4ff7d5045ff0768f085c14ca0da9ed32e77db36057a6d151d5d6f2f5"} Apr 23 17:59:55.742148 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:55.741764 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 17:59:55.761940 ip-10-0-131-144 kubenswrapper[2568]: I0423 17:59:55.761893 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" podStartSLOduration=1.3166014879999999 podStartE2EDuration="3.761877774s" podCreationTimestamp="2026-04-23 17:59:52 +0000 UTC" firstStartedPulling="2026-04-23 17:59:52.636726445 +0000 UTC m=+450.556818898" lastFinishedPulling="2026-04-23 17:59:55.082002728 +0000 UTC m=+453.002095184" observedRunningTime="2026-04-23 17:59:55.76013255 +0000 UTC m=+453.680225024" watchObservedRunningTime="2026-04-23 17:59:55.761877774 +0000 UTC m=+453.681970467" Apr 23 18:00:26.749175 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:26.749142 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6fc5d867c5-gddln" Apr 23 18:00:43.853653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:43.853619 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-tz8tt"] Apr 23 18:00:43.855560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:43.855544 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-tz8tt" Apr 23 18:00:43.868028 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:43.868001 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tz8tt"] Apr 23 18:00:43.965793 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:43.965748 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxn7k\" (UniqueName: \"kubernetes.io/projected/cf98f8c0-a0ce-4624-80ee-cbc30558c504-kube-api-access-xxn7k\") pod \"s3-init-tz8tt\" (UID: \"cf98f8c0-a0ce-4624-80ee-cbc30558c504\") " pod="kserve/s3-init-tz8tt" Apr 23 18:00:44.066794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:44.066748 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxn7k\" (UniqueName: \"kubernetes.io/projected/cf98f8c0-a0ce-4624-80ee-cbc30558c504-kube-api-access-xxn7k\") pod \"s3-init-tz8tt\" (UID: \"cf98f8c0-a0ce-4624-80ee-cbc30558c504\") " pod="kserve/s3-init-tz8tt" Apr 23 18:00:44.076561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:44.076528 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxn7k\" (UniqueName: \"kubernetes.io/projected/cf98f8c0-a0ce-4624-80ee-cbc30558c504-kube-api-access-xxn7k\") pod \"s3-init-tz8tt\" (UID: \"cf98f8c0-a0ce-4624-80ee-cbc30558c504\") " pod="kserve/s3-init-tz8tt" Apr 23 18:00:44.176443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:44.176348 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tz8tt" Apr 23 18:00:44.297693 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:44.297659 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tz8tt"] Apr 23 18:00:44.301296 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:00:44.301270 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf98f8c0_a0ce_4624_80ee_cbc30558c504.slice/crio-9cf4248fc91a9e623e3cfe984566dd868000a78d0c002521dbe1aefcac78abcc WatchSource:0}: Error finding container 9cf4248fc91a9e623e3cfe984566dd868000a78d0c002521dbe1aefcac78abcc: Status 404 returned error can't find the container with id 9cf4248fc91a9e623e3cfe984566dd868000a78d0c002521dbe1aefcac78abcc Apr 23 18:00:44.864858 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:44.864817 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tz8tt" event={"ID":"cf98f8c0-a0ce-4624-80ee-cbc30558c504","Type":"ContainerStarted","Data":"9cf4248fc91a9e623e3cfe984566dd868000a78d0c002521dbe1aefcac78abcc"} Apr 23 18:00:51.886412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:51.886379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tz8tt" event={"ID":"cf98f8c0-a0ce-4624-80ee-cbc30558c504","Type":"ContainerStarted","Data":"341f2e34d09631c3e534801c589e5188c19f4be154f472f18bb2c664dc9e7a6d"} Apr 23 18:00:51.902110 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:51.902061 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-tz8tt" podStartSLOduration=2.067683613 podStartE2EDuration="8.902044715s" podCreationTimestamp="2026-04-23 18:00:43 +0000 UTC" firstStartedPulling="2026-04-23 18:00:44.303614047 +0000 UTC m=+502.223706500" lastFinishedPulling="2026-04-23 18:00:51.137975133 +0000 UTC m=+509.058067602" observedRunningTime="2026-04-23 18:00:51.901573616 +0000 UTC m=+509.821666093" watchObservedRunningTime="2026-04-23 18:00:51.902044715 +0000 UTC 
m=+509.822137189" Apr 23 18:00:54.895380 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:54.895344 2568 generic.go:358] "Generic (PLEG): container finished" podID="cf98f8c0-a0ce-4624-80ee-cbc30558c504" containerID="341f2e34d09631c3e534801c589e5188c19f4be154f472f18bb2c664dc9e7a6d" exitCode=0 Apr 23 18:00:54.895759 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:54.895420 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tz8tt" event={"ID":"cf98f8c0-a0ce-4624-80ee-cbc30558c504","Type":"ContainerDied","Data":"341f2e34d09631c3e534801c589e5188c19f4be154f472f18bb2c664dc9e7a6d"} Apr 23 18:00:56.023347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.023321 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tz8tt" Apr 23 18:00:56.165974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.165868 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxn7k\" (UniqueName: \"kubernetes.io/projected/cf98f8c0-a0ce-4624-80ee-cbc30558c504-kube-api-access-xxn7k\") pod \"cf98f8c0-a0ce-4624-80ee-cbc30558c504\" (UID: \"cf98f8c0-a0ce-4624-80ee-cbc30558c504\") " Apr 23 18:00:56.168124 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.168091 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf98f8c0-a0ce-4624-80ee-cbc30558c504-kube-api-access-xxn7k" (OuterVolumeSpecName: "kube-api-access-xxn7k") pod "cf98f8c0-a0ce-4624-80ee-cbc30558c504" (UID: "cf98f8c0-a0ce-4624-80ee-cbc30558c504"). InnerVolumeSpecName "kube-api-access-xxn7k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:00:56.266698 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.266658 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxn7k\" (UniqueName: \"kubernetes.io/projected/cf98f8c0-a0ce-4624-80ee-cbc30558c504-kube-api-access-xxn7k\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:00:56.901522 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.901439 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-tz8tt" Apr 23 18:00:56.901522 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.901449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tz8tt" event={"ID":"cf98f8c0-a0ce-4624-80ee-cbc30558c504","Type":"ContainerDied","Data":"9cf4248fc91a9e623e3cfe984566dd868000a78d0c002521dbe1aefcac78abcc"} Apr 23 18:00:56.901522 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:00:56.901479 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cf4248fc91a9e623e3cfe984566dd868000a78d0c002521dbe1aefcac78abcc" Apr 23 18:01:30.273104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.273073 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9"] Apr 23 18:01:30.273593 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.273309 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf98f8c0-a0ce-4624-80ee-cbc30558c504" containerName="s3-init" Apr 23 18:01:30.273593 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.273319 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf98f8c0-a0ce-4624-80ee-cbc30558c504" containerName="s3-init" Apr 23 18:01:30.273593 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.273370 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf98f8c0-a0ce-4624-80ee-cbc30558c504" containerName="s3-init" Apr 23 18:01:30.307891 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.307854 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9"] Apr 23 18:01:30.308065 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.308002 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.310307 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.310288 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 23 18:01:30.310429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.310351 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 23 18:01:30.414046 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.414009 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/96c11d8f-83f1-4985-a4bb-dd57e96de581-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.414046 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.414054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/96c11d8f-83f1-4985-a4bb-dd57e96de581-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.414257 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.414149 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdfk\" (UniqueName: \"kubernetes.io/projected/96c11d8f-83f1-4985-a4bb-dd57e96de581-kube-api-access-ggdfk\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " 
pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.515130 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.515088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/96c11d8f-83f1-4985-a4bb-dd57e96de581-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.515130 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.515133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/96c11d8f-83f1-4985-a4bb-dd57e96de581-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.515395 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.515182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdfk\" (UniqueName: \"kubernetes.io/projected/96c11d8f-83f1-4985-a4bb-dd57e96de581-kube-api-access-ggdfk\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.515639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.515613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/96c11d8f-83f1-4985-a4bb-dd57e96de581-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.517479 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.517460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/96c11d8f-83f1-4985-a4bb-dd57e96de581-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.523859 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.523805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdfk\" (UniqueName: \"kubernetes.io/projected/96c11d8f-83f1-4985-a4bb-dd57e96de581-kube-api-access-ggdfk\") pod \"seaweedfs-tls-custom-5c88b85bb7-lxbm9\" (UID: \"96c11d8f-83f1-4985-a4bb-dd57e96de581\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.617055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.617016 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" Apr 23 18:01:30.735525 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.735328 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9"] Apr 23 18:01:30.738051 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:01:30.738024 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c11d8f_83f1_4985_a4bb_dd57e96de581.slice/crio-5d410fbcc7aebc28e0c6ddd6bb0bc72be715c220bd759a809c0f21dccfae6f87 WatchSource:0}: Error finding container 5d410fbcc7aebc28e0c6ddd6bb0bc72be715c220bd759a809c0f21dccfae6f87: Status 404 returned error can't find the container with id 5d410fbcc7aebc28e0c6ddd6bb0bc72be715c220bd759a809c0f21dccfae6f87 Apr 23 18:01:30.994068 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:30.994024 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" event={"ID":"96c11d8f-83f1-4985-a4bb-dd57e96de581","Type":"ContainerStarted","Data":"5d410fbcc7aebc28e0c6ddd6bb0bc72be715c220bd759a809c0f21dccfae6f87"} Apr 23 18:01:31.997774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:31.997737 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" event={"ID":"96c11d8f-83f1-4985-a4bb-dd57e96de581","Type":"ContainerStarted","Data":"dfd0039fa114f767ccb545a7fb33fcf5d523e21d1f90226d9b130947046acb7c"} Apr 23 18:01:32.014499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.014446 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-lxbm9" podStartSLOduration=1.6656298170000001 podStartE2EDuration="2.01443191s" podCreationTimestamp="2026-04-23 18:01:30 +0000 UTC" firstStartedPulling="2026-04-23 18:01:30.739254601 +0000 UTC m=+548.659347054" lastFinishedPulling="2026-04-23 18:01:31.088056691 +0000 UTC m=+549.008149147" observedRunningTime="2026-04-23 18:01:32.013270455 +0000 UTC m=+549.933362930" watchObservedRunningTime="2026-04-23 18:01:32.01443191 +0000 UTC m=+549.934524385" Apr 23 18:01:32.304485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.304406 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-nlrj7"] Apr 23 18:01:32.308526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.308508 2568 util.go:30] "No sandbox for pod can be found. 
Apr 23 18:01:32.314398 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.314370 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-nlrj7"]
Apr 23 18:01:32.432185 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.432143 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhln\" (UniqueName: \"kubernetes.io/projected/f056c1db-118b-4363-a38e-e50097295112-kube-api-access-2jhln\") pod \"s3-tls-init-custom-nlrj7\" (UID: \"f056c1db-118b-4363-a38e-e50097295112\") " pod="kserve/s3-tls-init-custom-nlrj7"
Apr 23 18:01:32.532950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.532896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jhln\" (UniqueName: \"kubernetes.io/projected/f056c1db-118b-4363-a38e-e50097295112-kube-api-access-2jhln\") pod \"s3-tls-init-custom-nlrj7\" (UID: \"f056c1db-118b-4363-a38e-e50097295112\") " pod="kserve/s3-tls-init-custom-nlrj7"
Apr 23 18:01:32.542384 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.542354 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhln\" (UniqueName: \"kubernetes.io/projected/f056c1db-118b-4363-a38e-e50097295112-kube-api-access-2jhln\") pod \"s3-tls-init-custom-nlrj7\" (UID: \"f056c1db-118b-4363-a38e-e50097295112\") " pod="kserve/s3-tls-init-custom-nlrj7"
Apr 23 18:01:32.632893 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.632855 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-nlrj7"
Apr 23 18:01:32.750636 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:32.750603 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-nlrj7"]
Apr 23 18:01:32.753487 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:01:32.753453 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf056c1db_118b_4363_a38e_e50097295112.slice/crio-cb17b23e06f029517fff78d3bf8112019bcc2d0c0cfdb3874c52377753f79671 WatchSource:0}: Error finding container cb17b23e06f029517fff78d3bf8112019bcc2d0c0cfdb3874c52377753f79671: Status 404 returned error can't find the container with id cb17b23e06f029517fff78d3bf8112019bcc2d0c0cfdb3874c52377753f79671
Apr 23 18:01:33.002328 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:33.002295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-nlrj7" event={"ID":"f056c1db-118b-4363-a38e-e50097295112","Type":"ContainerStarted","Data":"1b34219851f1e69922f4c27ebd8e73dfa3149bee4610ede5a94f3bdd559427fe"}
Apr 23 18:01:33.002328 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:33.002328 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-nlrj7" event={"ID":"f056c1db-118b-4363-a38e-e50097295112","Type":"ContainerStarted","Data":"cb17b23e06f029517fff78d3bf8112019bcc2d0c0cfdb3874c52377753f79671"}
Apr 23 18:01:33.029826 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:33.029775 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-nlrj7" podStartSLOduration=1.029761302 podStartE2EDuration="1.029761302s" podCreationTimestamp="2026-04-23 18:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:01:33.028156947 +0000 UTC m=+550.948249421" watchObservedRunningTime="2026-04-23 18:01:33.029761302 +0000 UTC m=+550.949853777"
m=+550.948249421" watchObservedRunningTime="2026-04-23 18:01:33.029761302 +0000 UTC m=+550.949853777" Apr 23 18:01:39.019088 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:39.019052 2568 generic.go:358] "Generic (PLEG): container finished" podID="f056c1db-118b-4363-a38e-e50097295112" containerID="1b34219851f1e69922f4c27ebd8e73dfa3149bee4610ede5a94f3bdd559427fe" exitCode=0 Apr 23 18:01:39.019506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:39.019118 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-nlrj7" event={"ID":"f056c1db-118b-4363-a38e-e50097295112","Type":"ContainerDied","Data":"1b34219851f1e69922f4c27ebd8e73dfa3149bee4610ede5a94f3bdd559427fe"} Apr 23 18:01:40.148266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:40.148242 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-nlrj7" Apr 23 18:01:40.288121 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:40.288036 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jhln\" (UniqueName: \"kubernetes.io/projected/f056c1db-118b-4363-a38e-e50097295112-kube-api-access-2jhln\") pod \"f056c1db-118b-4363-a38e-e50097295112\" (UID: \"f056c1db-118b-4363-a38e-e50097295112\") " Apr 23 18:01:40.290036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:40.290000 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f056c1db-118b-4363-a38e-e50097295112-kube-api-access-2jhln" (OuterVolumeSpecName: "kube-api-access-2jhln") pod "f056c1db-118b-4363-a38e-e50097295112" (UID: "f056c1db-118b-4363-a38e-e50097295112"). InnerVolumeSpecName "kube-api-access-2jhln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:01:40.389167 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:40.389130 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jhln\" (UniqueName: \"kubernetes.io/projected/f056c1db-118b-4363-a38e-e50097295112-kube-api-access-2jhln\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:01:41.025176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:41.025103 2568 util.go:48] "No ready sandbox for pod can be found. 
Apr 23 18:01:41.025176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:41.025107 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-nlrj7" event={"ID":"f056c1db-118b-4363-a38e-e50097295112","Type":"ContainerDied","Data":"cb17b23e06f029517fff78d3bf8112019bcc2d0c0cfdb3874c52377753f79671"}
Apr 23 18:01:41.025176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:41.025137 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb17b23e06f029517fff78d3bf8112019bcc2d0c0cfdb3874c52377753f79671"
Apr 23 18:01:43.795767 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.795736 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-shx7t"]
Apr 23 18:01:43.796295 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.796159 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f056c1db-118b-4363-a38e-e50097295112" containerName="s3-tls-init-custom"
Apr 23 18:01:43.796295 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.796178 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f056c1db-118b-4363-a38e-e50097295112" containerName="s3-tls-init-custom"
Apr 23 18:01:43.796295 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.796239 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f056c1db-118b-4363-a38e-e50097295112" containerName="s3-tls-init-custom"
Apr 23 18:01:43.798146 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.798124 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:43.799897 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.799877 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 23 18:01:43.808785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.808767 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-shx7t"]
Apr 23 18:01:43.918114 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:43.918078 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksp7g\" (UniqueName: \"kubernetes.io/projected/4aad7b75-b3fc-4a86-abf3-cd9964ed28a3-kube-api-access-ksp7g\") pod \"s3-tls-init-serving-shx7t\" (UID: \"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3\") " pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:44.018948 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:44.018897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksp7g\" (UniqueName: \"kubernetes.io/projected/4aad7b75-b3fc-4a86-abf3-cd9964ed28a3-kube-api-access-ksp7g\") pod \"s3-tls-init-serving-shx7t\" (UID: \"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3\") " pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:44.028289 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:44.028264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksp7g\" (UniqueName: \"kubernetes.io/projected/4aad7b75-b3fc-4a86-abf3-cd9964ed28a3-kube-api-access-ksp7g\") pod \"s3-tls-init-serving-shx7t\" (UID: \"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3\") " pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:44.120087 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:44.120052 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:44.237973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:44.237942 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-shx7t"]
Apr 23 18:01:44.241230 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:01:44.241203 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aad7b75_b3fc_4a86_abf3_cd9964ed28a3.slice/crio-ef8f807f616a12ec6e1818261f70541fee1f32bff0af339e1bf6d32e76e82f2c WatchSource:0}: Error finding container ef8f807f616a12ec6e1818261f70541fee1f32bff0af339e1bf6d32e76e82f2c: Status 404 returned error can't find the container with id ef8f807f616a12ec6e1818261f70541fee1f32bff0af339e1bf6d32e76e82f2c
Apr 23 18:01:45.038262 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:45.038229 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-shx7t" event={"ID":"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3","Type":"ContainerStarted","Data":"ba915a60f69280669d778112d5ef8484eaaff45dac8dbb9d3f4337a5b7d1520b"}
Apr 23 18:01:45.038262 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:45.038263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-shx7t" event={"ID":"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3","Type":"ContainerStarted","Data":"ef8f807f616a12ec6e1818261f70541fee1f32bff0af339e1bf6d32e76e82f2c"}
Apr 23 18:01:45.054978 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:45.054905 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-shx7t" podStartSLOduration=2.05488686 podStartE2EDuration="2.05488686s" podCreationTimestamp="2026-04-23 18:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:01:45.052417554 +0000 UTC m=+562.972510028" watchObservedRunningTime="2026-04-23 18:01:45.05488686 +0000 UTC m=+562.974979338"
Apr 23 18:01:49.050361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:49.050272 2568 generic.go:358] "Generic (PLEG): container finished" podID="4aad7b75-b3fc-4a86-abf3-cd9964ed28a3" containerID="ba915a60f69280669d778112d5ef8484eaaff45dac8dbb9d3f4337a5b7d1520b" exitCode=0
Apr 23 18:01:49.050361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:49.050328 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-shx7t" event={"ID":"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3","Type":"ContainerDied","Data":"ba915a60f69280669d778112d5ef8484eaaff45dac8dbb9d3f4337a5b7d1520b"}
Apr 23 18:01:50.178465 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:50.178441 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:50.268664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:50.268638 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksp7g\" (UniqueName: \"kubernetes.io/projected/4aad7b75-b3fc-4a86-abf3-cd9964ed28a3-kube-api-access-ksp7g\") pod \"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3\" (UID: \"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3\") "
Apr 23 18:01:50.270502 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:50.270468 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aad7b75-b3fc-4a86-abf3-cd9964ed28a3-kube-api-access-ksp7g" (OuterVolumeSpecName: "kube-api-access-ksp7g") pod "4aad7b75-b3fc-4a86-abf3-cd9964ed28a3" (UID: "4aad7b75-b3fc-4a86-abf3-cd9964ed28a3"). InnerVolumeSpecName "kube-api-access-ksp7g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:01:50.369863 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:50.369827 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ksp7g\" (UniqueName: \"kubernetes.io/projected/4aad7b75-b3fc-4a86-abf3-cd9964ed28a3-kube-api-access-ksp7g\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:01:51.057162 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:51.057136 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-shx7t"
Apr 23 18:01:51.057330 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:51.057136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-shx7t" event={"ID":"4aad7b75-b3fc-4a86-abf3-cd9964ed28a3","Type":"ContainerDied","Data":"ef8f807f616a12ec6e1818261f70541fee1f32bff0af339e1bf6d32e76e82f2c"}
Apr 23 18:01:51.057330 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:01:51.057239 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef8f807f616a12ec6e1818261f70541fee1f32bff0af339e1bf6d32e76e82f2c"
Apr 23 18:02:00.477474 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.477437 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"]
Apr 23 18:02:00.478013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.477727 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4aad7b75-b3fc-4a86-abf3-cd9964ed28a3" containerName="s3-tls-init-serving"
Apr 23 18:02:00.478013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.477743 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aad7b75-b3fc-4a86-abf3-cd9964ed28a3" containerName="s3-tls-init-serving"
Apr 23 18:02:00.478013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.477803 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4aad7b75-b3fc-4a86-abf3-cd9964ed28a3" containerName="s3-tls-init-serving"
Apr 23 18:02:00.479987 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.479967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
Apr 23 18:02:00.481881 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.481850 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\""
Apr 23 18:02:00.482036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.481940 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\""
Apr 23 18:02:00.482036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.481979 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:02:00.482036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.482010 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-nl4mc\""
Apr 23 18:02:00.482227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.482040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:02:00.489731 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.489708 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"]
Apr 23 18:02:00.539862 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.539832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/deafc019-aef5-49a0-8284-0aa4ec490643-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
Apr 23 18:02:00.540020 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.539877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
Apr 23 18:02:00.540020 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.539906 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/deafc019-aef5-49a0-8284-0aa4ec490643-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
Apr 23 18:02:00.540020 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.539987 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtd8\" (UniqueName: \"kubernetes.io/projected/deafc019-aef5-49a0-8284-0aa4ec490643-kube-api-access-nhtd8\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
Apr 23 18:02:00.640751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.640718 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.640915 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.640759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/deafc019-aef5-49a0-8284-0aa4ec490643-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.640915 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.640803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtd8\" (UniqueName: \"kubernetes.io/projected/deafc019-aef5-49a0-8284-0aa4ec490643-kube-api-access-nhtd8\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.640915 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.640848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/deafc019-aef5-49a0-8284-0aa4ec490643-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.641168 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.641147 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.641419 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.641397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/deafc019-aef5-49a0-8284-0aa4ec490643-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.643230 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.643209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/deafc019-aef5-49a0-8284-0aa4ec490643-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.649513 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.649491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtd8\" (UniqueName: \"kubernetes.io/projected/deafc019-aef5-49a0-8284-0aa4ec490643-kube-api-access-nhtd8\") pod \"isvc-sklearn-batcher-predictor-7d685d4755-rxnvh\" (UID: 
\"deafc019-aef5-49a0-8284-0aa4ec490643\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.790082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.789999 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:00.909859 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:00.909832 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"] Apr 23 18:02:00.912365 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:02:00.912334 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeafc019_aef5_49a0_8284_0aa4ec490643.slice/crio-d328671de7cb9f57390084bf229db0c584bf4450719894b72f1cb5b0c3b536ce WatchSource:0}: Error finding container d328671de7cb9f57390084bf229db0c584bf4450719894b72f1cb5b0c3b536ce: Status 404 returned error can't find the container with id d328671de7cb9f57390084bf229db0c584bf4450719894b72f1cb5b0c3b536ce Apr 23 18:02:01.082729 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:01.082693 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerStarted","Data":"d328671de7cb9f57390084bf229db0c584bf4450719894b72f1cb5b0c3b536ce"} Apr 23 18:02:05.096711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:05.096676 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerStarted","Data":"58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c"} Apr 23 18:02:09.108530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:09.108496 2568 generic.go:358] "Generic (PLEG): container finished" podID="deafc019-aef5-49a0-8284-0aa4ec490643" containerID="58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c" exitCode=0 Apr 23 18:02:09.108888 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:09.108569 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerDied","Data":"58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c"} Apr 23 18:02:22.154282 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:22.154241 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerStarted","Data":"6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932"} Apr 23 18:02:22.541337 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:22.541249 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:02:22.541337 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:22.541254 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:02:22.548273 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:22.548201 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:02:22.548868 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:22.548847 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:02:25.165780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:25.165742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerStarted","Data":"b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0"} Apr 23 18:02:28.181468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:28.181429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerStarted","Data":"f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230"} Apr 23 18:02:28.181871 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:28.181665 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:28.181871 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:28.181780 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:28.182891 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:28.182851 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:02:28.201044 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:28.200997 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podStartSLOduration=1.912424595 podStartE2EDuration="28.200983898s" podCreationTimestamp="2026-04-23 18:02:00 +0000 UTC" firstStartedPulling="2026-04-23 18:02:00.914306356 +0000 UTC m=+578.834398809" lastFinishedPulling="2026-04-23 18:02:27.202865659 +0000 UTC m=+605.122958112" observedRunningTime="2026-04-23 18:02:28.200173097 +0000 UTC m=+606.120265571" watchObservedRunningTime="2026-04-23 18:02:28.200983898 +0000 UTC m=+606.121076374" Apr 23 18:02:29.184601 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:29.184570 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:29.185069 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:29.184687 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:02:29.185508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:29.185482 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" 
podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:02:29.194796 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:29.194778 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:02:30.187561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:30.187518 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:02:30.187943 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:30.187724 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:02:31.189890 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:31.189847 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:02:31.190356 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:31.190204 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:02:41.190665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:41.190616 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:02:41.191149 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:41.191078 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:02:51.190058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:51.190005 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:02:51.190450 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:02:51.190398 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:03:01.190379 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:01.190329 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:03:01.190883 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:01.190797 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:03:11.190819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:11.190755 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:03:11.191253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:11.191186 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:03:21.190542 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:21.190493 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:03:21.191075 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:21.191049 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:03:31.191045 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:31.191011 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:03:31.191473 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:31.191182 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:03:45.570385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.570351 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"] Apr 23 18:03:45.570860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.570717 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" containerID="cri-o://6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932" gracePeriod=30 Apr 23 18:03:45.570860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.570721 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" containerID="cri-o://f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230" gracePeriod=30 Apr 23 18:03:45.570860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.570754 2568 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" containerID="cri-o://b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0" gracePeriod=30 Apr 23 18:03:45.695828 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.695796 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"] Apr 23 18:03:45.698174 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.698153 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:45.700045 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.700024 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 23 18:03:45.700132 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.700024 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 23 18:03:45.708689 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.708664 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"] Apr 23 18:03:45.842152 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.842118 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:45.842334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.842188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:45.842334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.842227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:45.842334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.842245 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zlmx\" (UniqueName: \"kubernetes.io/projected/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kube-api-access-5zlmx\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:45.943456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.943420 2568 
Apr 23 18:03:45.943456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.943460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zlmx\" (UniqueName: \"kubernetes.io/projected/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kube-api-access-5zlmx\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:45.943706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.943500 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:45.943706 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:03:45.943650 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-serving-cert: secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found
Apr 23 18:03:45.943706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.943678 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:45.943849 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:03:45.943714 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls podName:69efbdf7-2636-4f73-b23d-4b90e5ab00a7 nodeName:}" failed. No retries permitted until 2026-04-23 18:03:46.443693004 +0000 UTC m=+684.363785474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls") pod "isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" (UID: "69efbdf7-2636-4f73-b23d-4b90e5ab00a7") : secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found
Apr 23 18:03:45.943917 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.943846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:45.944290 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.944268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:45.954303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:45.954272 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zlmx\" (UniqueName: \"kubernetes.io/projected/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kube-api-access-5zlmx\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:46.392990 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:46.392958 2568 generic.go:358] "Generic (PLEG): container finished" podID="deafc019-aef5-49a0-8284-0aa4ec490643" containerID="b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0" exitCode=2
Apr 23 18:03:46.393169 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:46.393031 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerDied","Data":"b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0"}
Apr 23 18:03:46.447635 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:46.447602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:46.450004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:46.449980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:46.608772 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:46.608734 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:46.735523 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:46.735465 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"]
Apr 23 18:03:46.742770 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:03:46.742738 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69efbdf7_2636_4f73_b23d_4b90e5ab00a7.slice/crio-fc215b02c8e3e88717f555a7f12689dba1311c645b8bcb9b6031365dd62dbe35 WatchSource:0}: Error finding container fc215b02c8e3e88717f555a7f12689dba1311c645b8bcb9b6031365dd62dbe35: Status 404 returned error can't find the container with id fc215b02c8e3e88717f555a7f12689dba1311c645b8bcb9b6031365dd62dbe35
Apr 23 18:03:47.397642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:47.397593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerStarted","Data":"4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48"}
Apr 23 18:03:47.397642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:47.397644 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerStarted","Data":"fc215b02c8e3e88717f555a7f12689dba1311c645b8bcb9b6031365dd62dbe35"}
Apr 23 18:03:49.185495 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:49.185445 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused"
Apr 23 18:03:50.408189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:50.408156 2568 generic.go:358] "Generic (PLEG): container finished" podID="deafc019-aef5-49a0-8284-0aa4ec490643" containerID="6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932" exitCode=0
Apr 23 18:03:50.408574 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:50.408236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerDied","Data":"6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932"}
Apr 23 18:03:51.190119 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:51.190069 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 23 18:03:51.190362 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:51.190333 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:51.412645 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:51.412610 2568 generic.go:358] "Generic (PLEG): container finished" podID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerID="4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48" exitCode=0
podID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerID="4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48" exitCode=0 Apr 23 18:03:51.413075 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:51.412665 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerDied","Data":"4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48"} Apr 23 18:03:52.417413 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.417379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerStarted","Data":"aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439"} Apr 23 18:03:52.417413 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.417419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerStarted","Data":"f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8"} Apr 23 18:03:52.417895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.417429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerStarted","Data":"7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894"} Apr 23 18:03:52.417895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.417736 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:52.417895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.417770 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:52.417895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.417782 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:03:52.419277 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.419252 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:03:52.419909 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.419888 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:03:52.436534 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:52.436484 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podStartSLOduration=7.436472387 podStartE2EDuration="7.436472387s" podCreationTimestamp="2026-04-23 18:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
Apr 23 18:03:53.420472 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:53.420431 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused"
Apr 23 18:03:53.420866 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:53.420831 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:54.185717 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:54.185676 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused"
Apr 23 18:03:58.424473 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:58.424445 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:03:58.425140 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:58.425106 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused"
Apr 23 18:03:58.425478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:58.425459 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:59.184857 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:59.184814 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused"
Apr 23 18:03:59.185067 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:03:59.184992 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"
Apr 23 18:04:01.190652 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:01.190609 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 23 18:04:01.191081 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:01.190968 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:04.185478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:04.185387 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused" Apr 23 18:04:08.425130 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:08.425086 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:04:08.425575 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:08.425552 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:09.185587 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:09.185545 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused" Apr 23 18:04:11.189878 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:11.189838 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 23 18:04:11.190296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:11.189999 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:04:11.190296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:11.190183 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:11.190296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:11.190293 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:04:14.185044 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:14.184997 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused" Apr 23 18:04:15.718303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.718278 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:04:15.875196 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.875107 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhtd8\" (UniqueName: \"kubernetes.io/projected/deafc019-aef5-49a0-8284-0aa4ec490643-kube-api-access-nhtd8\") pod \"deafc019-aef5-49a0-8284-0aa4ec490643\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " Apr 23 18:04:15.875196 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.875159 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/deafc019-aef5-49a0-8284-0aa4ec490643-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"deafc019-aef5-49a0-8284-0aa4ec490643\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " Apr 23 18:04:15.875196 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.875185 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/deafc019-aef5-49a0-8284-0aa4ec490643-proxy-tls\") pod \"deafc019-aef5-49a0-8284-0aa4ec490643\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " Apr 23 18:04:15.875478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.875295 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location\") pod \"deafc019-aef5-49a0-8284-0aa4ec490643\" (UID: \"deafc019-aef5-49a0-8284-0aa4ec490643\") " Apr 23 18:04:15.875581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.875545 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deafc019-aef5-49a0-8284-0aa4ec490643-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "deafc019-aef5-49a0-8284-0aa4ec490643" (UID: "deafc019-aef5-49a0-8284-0aa4ec490643"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:04:15.875712 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.875599 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "deafc019-aef5-49a0-8284-0aa4ec490643" (UID: "deafc019-aef5-49a0-8284-0aa4ec490643"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:04:15.877441 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.877414 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deafc019-aef5-49a0-8284-0aa4ec490643-kube-api-access-nhtd8" (OuterVolumeSpecName: "kube-api-access-nhtd8") pod "deafc019-aef5-49a0-8284-0aa4ec490643" (UID: "deafc019-aef5-49a0-8284-0aa4ec490643"). InnerVolumeSpecName "kube-api-access-nhtd8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:04:15.877915 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.877895 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deafc019-aef5-49a0-8284-0aa4ec490643-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "deafc019-aef5-49a0-8284-0aa4ec490643" (UID: "deafc019-aef5-49a0-8284-0aa4ec490643"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:04:15.975902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.975868 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhtd8\" (UniqueName: \"kubernetes.io/projected/deafc019-aef5-49a0-8284-0aa4ec490643-kube-api-access-nhtd8\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:04:15.975902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.975897 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/deafc019-aef5-49a0-8284-0aa4ec490643-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:04:15.975902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.975911 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/deafc019-aef5-49a0-8284-0aa4ec490643-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:04:15.976145 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:15.975921 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deafc019-aef5-49a0-8284-0aa4ec490643-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:04:16.491757 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.491721 2568 generic.go:358] "Generic (PLEG): container finished" podID="deafc019-aef5-49a0-8284-0aa4ec490643" containerID="f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230" exitCode=0 Apr 23 18:04:16.491952 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.491769 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerDied","Data":"f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230"} Apr 23 18:04:16.491952 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.491797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" event={"ID":"deafc019-aef5-49a0-8284-0aa4ec490643","Type":"ContainerDied","Data":"d328671de7cb9f57390084bf229db0c584bf4450719894b72f1cb5b0c3b536ce"} Apr 23 18:04:16.491952 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.491813 2568 scope.go:117] "RemoveContainer" containerID="f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230" Apr 23 18:04:16.491952 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.491814 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh" Apr 23 18:04:16.499917 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.499898 2568 scope.go:117] "RemoveContainer" containerID="b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0" Apr 23 18:04:16.507193 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.507173 2568 scope.go:117] "RemoveContainer" containerID="6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932" Apr 23 18:04:16.514533 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.514473 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"] Apr 23 18:04:16.514592 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.514573 2568 scope.go:117] "RemoveContainer" containerID="58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c" Apr 23 18:04:16.521530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.521503 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7d685d4755-rxnvh"] Apr 23 18:04:16.522172 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.522149 2568 scope.go:117] "RemoveContainer" containerID="f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230" Apr 23 18:04:16.522449 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:04:16.522421 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230\": container with ID starting with f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230 not found: ID does not exist" containerID="f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230" Apr 23 18:04:16.522497 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.522453 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230"} err="failed to get container status \"f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230\": rpc error: code = NotFound desc = could not find container \"f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230\": container with ID starting with f118d24c2274e77a6d6418000dc40dd7638bcc18c55ec760f52356fc75e95230 not found: ID does not exist" Apr 23 18:04:16.522497 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.522485 2568 scope.go:117] "RemoveContainer" containerID="b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0" Apr 23 18:04:16.522711 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:04:16.522692 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0\": container with ID starting with b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0 not found: ID does not exist" containerID="b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0" Apr 23 18:04:16.522753 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.522718 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0"} err="failed to get container status \"b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0\": rpc error: code = NotFound desc = could not find container 
\"b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0\": container with ID starting with b1908e4f07a45f2e53100c54a3c451f9f6909b992b2e98c38d66c4bf42d9bbe0 not found: ID does not exist" Apr 23 18:04:16.522753 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.522734 2568 scope.go:117] "RemoveContainer" containerID="6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932" Apr 23 18:04:16.522997 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:04:16.522981 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932\": container with ID starting with 6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932 not found: ID does not exist" containerID="6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932" Apr 23 18:04:16.523046 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.523002 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932"} err="failed to get container status \"6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932\": rpc error: code = NotFound desc = could not find container \"6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932\": container with ID starting with 6c61447d71a630b2550a6b159e63c749bb9c6564209be73d1aca94462db87932 not found: ID does not exist" Apr 23 18:04:16.523046 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.523017 2568 scope.go:117] "RemoveContainer" containerID="58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c" Apr 23 18:04:16.523262 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:04:16.523245 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c\": container with ID starting with 58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c not found: ID does not exist" containerID="58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c" Apr 23 18:04:16.523304 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.523266 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c"} err="failed to get container status \"58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c\": rpc error: code = NotFound desc = could not find container \"58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c\": container with ID starting with 58233605ee23d997c35f2b2c73574ee7038684b4dedf1a9ce737842de5c82b7c not found: ID does not exist" Apr 23 18:04:16.599249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:16.599218 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" path="/var/lib/kubelet/pods/deafc019-aef5-49a0-8284-0aa4ec490643/volumes" Apr 23 18:04:18.425339 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:18.425295 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:04:18.425785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:18.425745 2568 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:28.425620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:28.425569 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:04:28.426097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:28.426039 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:38.425517 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:38.425470 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:04:38.425959 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:38.425887 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:48.425555 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:48.425512 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:04:48.426059 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:48.425974 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:58.425886 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:58.425848 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:04:58.426410 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:04:58.426001 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:05:10.749173 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.749132 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"] Apr 23 18:05:10.749637 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.749576 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" containerID="cri-o://7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894" 
gracePeriod=30 Apr 23 18:05:10.749637 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.749593 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" containerID="cri-o://aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439" gracePeriod=30 Apr 23 18:05:10.749834 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.749618 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" containerID="cri-o://f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8" gracePeriod=30 Apr 23 18:05:10.826660 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826628 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"] Apr 23 18:05:10.826898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826887 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" Apr 23 18:05:10.826957 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826900 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" Apr 23 18:05:10.826957 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826917 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="storage-initializer" Apr 23 18:05:10.826957 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826923 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="storage-initializer" Apr 23 18:05:10.826957 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826951 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" Apr 23 18:05:10.827082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826960 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" Apr 23 18:05:10.827082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826973 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" Apr 23 18:05:10.827082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.826978 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" Apr 23 18:05:10.827082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.827020 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kube-rbac-proxy" Apr 23 18:05:10.827082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.827028 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="kserve-container" Apr 23 18:05:10.827082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.827036 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="deafc019-aef5-49a0-8284-0aa4ec490643" containerName="agent" Apr 23 18:05:10.829083 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:05:10.829066 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:10.831121 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.831099 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 23 18:05:10.831121 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.831115 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 23 18:05:10.839817 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.839793 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"] Apr 23 18:05:10.995447 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.995418 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2e8b183-1919-403e-8b9f-e260d2938ab7-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:10.995447 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.995459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltkx\" (UniqueName: \"kubernetes.io/projected/e2e8b183-1919-403e-8b9f-e260d2938ab7-kube-api-access-tltkx\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:10.995678 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:10.995480 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.096387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.096359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2e8b183-1919-403e-8b9f-e260d2938ab7-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.096586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.096408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tltkx\" (UniqueName: \"kubernetes.io/projected/e2e8b183-1919-403e-8b9f-e260d2938ab7-kube-api-access-tltkx\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.096586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.096438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: 
\"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.096586 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:11.096568 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 23 18:05:11.096747 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:11.096636 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls podName:e2e8b183-1919-403e-8b9f-e260d2938ab7 nodeName:}" failed. No retries permitted until 2026-04-23 18:05:11.596614805 +0000 UTC m=+769.516707259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-sghjq" (UID: "e2e8b183-1919-403e-8b9f-e260d2938ab7") : secret "message-dumper-predictor-serving-cert" not found Apr 23 18:05:11.097072 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.097048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2e8b183-1919-403e-8b9f-e260d2938ab7-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.106988 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.106956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltkx\" (UniqueName: \"kubernetes.io/projected/e2e8b183-1919-403e-8b9f-e260d2938ab7-kube-api-access-tltkx\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.600089 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.600048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.602390 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.602352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-sghjq\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.646192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.646160 2568 generic.go:358] "Generic (PLEG): container finished" podID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerID="f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8" exitCode=2 Apr 23 18:05:11.646385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.646235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerDied","Data":"f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8"} Apr 23 18:05:11.739028 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.738994 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:11.857736 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.857584 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"] Apr 23 18:05:11.862121 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:05:11.862083 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e8b183_1919_403e_8b9f_e260d2938ab7.slice/crio-98188dae3e69ae9bb94b1df73b9beed1f3a4e69bfe895d1bd7970c68d497f992 WatchSource:0}: Error finding container 98188dae3e69ae9bb94b1df73b9beed1f3a4e69bfe895d1bd7970c68d497f992: Status 404 returned error can't find the container with id 98188dae3e69ae9bb94b1df73b9beed1f3a4e69bfe895d1bd7970c68d497f992 Apr 23 18:05:11.863905 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:11.863890 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:05:12.650251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:12.650212 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" event={"ID":"e2e8b183-1919-403e-8b9f-e260d2938ab7","Type":"ContainerStarted","Data":"98188dae3e69ae9bb94b1df73b9beed1f3a4e69bfe895d1bd7970c68d497f992"} Apr 23 18:05:13.421439 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:13.421394 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 23 18:05:13.654742 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:13.654708 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" event={"ID":"e2e8b183-1919-403e-8b9f-e260d2938ab7","Type":"ContainerStarted","Data":"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"} Apr 23 18:05:13.654742 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:13.654746 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" event={"ID":"e2e8b183-1919-403e-8b9f-e260d2938ab7","Type":"ContainerStarted","Data":"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"} Apr 23 18:05:13.654993 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:13.654973 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:13.675019 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:13.674899 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" podStartSLOduration=2.693283858 podStartE2EDuration="3.6748837s" podCreationTimestamp="2026-04-23 18:05:10 +0000 UTC" firstStartedPulling="2026-04-23 18:05:11.86409538 +0000 UTC m=+769.784187839" lastFinishedPulling="2026-04-23 18:05:12.845695224 +0000 UTC m=+770.765787681" observedRunningTime="2026-04-23 18:05:13.672995805 +0000 UTC m=+771.593088277" watchObservedRunningTime="2026-04-23 18:05:13.6748837 +0000 UTC m=+771.594976175" Apr 23 18:05:14.657771 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:14.657739 2568 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:14.660064 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:14.660035 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:15.662792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:15.662753 2568 generic.go:358] "Generic (PLEG): container finished" podID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerID="7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894" exitCode=0 Apr 23 18:05:15.663269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:15.662819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerDied","Data":"7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894"} Apr 23 18:05:18.421227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:18.421177 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 23 18:05:18.425467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:18.425441 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:05:18.425785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:18.425764 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:21.671845 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:21.671818 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" Apr 23 18:05:23.420605 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:23.420565 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 23 18:05:23.420996 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:23.420695 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:05:28.421636 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:28.421594 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 23 18:05:28.424978 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:28.424952 2568 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:05:28.425240 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:28.425216 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:30.887418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.887383 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"] Apr 23 18:05:30.891054 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.891033 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:30.893068 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.893048 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 23 18:05:30.893168 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.893079 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 23 18:05:30.902084 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.902062 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"] Apr 23 18:05:30.950308 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.950279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:30.950482 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.950371 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b967c25c-c771-4049-b58e-99ba94d6022f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:30.950482 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.950403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b967c25c-c771-4049-b58e-99ba94d6022f-kserve-provision-location\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:30.950482 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:30.950439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2sz\" (UniqueName: \"kubernetes.io/projected/b967c25c-c771-4049-b58e-99ba94d6022f-kube-api-access-kh2sz\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " 
pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.051031 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.050987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.051223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.051056 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b967c25c-c771-4049-b58e-99ba94d6022f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.051223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.051077 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b967c25c-c771-4049-b58e-99ba94d6022f-kserve-provision-location\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.051223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.051105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2sz\" (UniqueName: \"kubernetes.io/projected/b967c25c-c771-4049-b58e-99ba94d6022f-kube-api-access-kh2sz\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.051223 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:31.051156 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-predictor-serving-cert: secret "isvc-logger-predictor-serving-cert" not found Apr 23 18:05:31.051223 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:31.051226 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls podName:b967c25c-c771-4049-b58e-99ba94d6022f nodeName:}" failed. No retries permitted until 2026-04-23 18:05:31.551204585 +0000 UTC m=+789.471297051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls") pod "isvc-logger-predictor-69b866599d-9j5jb" (UID: "b967c25c-c771-4049-b58e-99ba94d6022f") : secret "isvc-logger-predictor-serving-cert" not found Apr 23 18:05:31.051514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.051491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b967c25c-c771-4049-b58e-99ba94d6022f-kserve-provision-location\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.051837 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.051815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b967c25c-c771-4049-b58e-99ba94d6022f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.060048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.060020 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2sz\" (UniqueName: \"kubernetes.io/projected/b967c25c-c771-4049-b58e-99ba94d6022f-kube-api-access-kh2sz\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.553715 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.553680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.556076 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.556052 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls\") pod \"isvc-logger-predictor-69b866599d-9j5jb\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.801177 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.801137 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:31.930426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:31.930399 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"] Apr 23 18:05:31.933078 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:05:31.933048 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb967c25c_c771_4049_b58e_99ba94d6022f.slice/crio-988988277c9f811f3cbfd7dc57632a26034d9f0b9ca96022026580cee72d90af WatchSource:0}: Error finding container 988988277c9f811f3cbfd7dc57632a26034d9f0b9ca96022026580cee72d90af: Status 404 returned error can't find the container with id 988988277c9f811f3cbfd7dc57632a26034d9f0b9ca96022026580cee72d90af Apr 23 18:05:32.711961 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:32.711858 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerStarted","Data":"630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673"} Apr 23 18:05:32.711961 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:32.711897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerStarted","Data":"988988277c9f811f3cbfd7dc57632a26034d9f0b9ca96022026580cee72d90af"} Apr 23 18:05:33.420814 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:33.420766 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 23 18:05:35.723010 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:35.722977 2568 generic.go:358] "Generic (PLEG): container finished" podID="b967c25c-c771-4049-b58e-99ba94d6022f" containerID="630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673" exitCode=0 Apr 23 18:05:35.723375 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:35.723052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerDied","Data":"630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673"} Apr 23 18:05:36.728036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.728004 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerStarted","Data":"5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24"} Apr 23 18:05:36.728422 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.728045 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerStarted","Data":"2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5"} Apr 23 18:05:36.728422 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.728056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" 
event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerStarted","Data":"65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a"} Apr 23 18:05:36.728422 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.728392 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:36.728547 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.728525 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:36.729797 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.729770 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 23 18:05:36.748054 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:36.748015 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podStartSLOduration=6.748003399 podStartE2EDuration="6.748003399s" podCreationTimestamp="2026-04-23 18:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:05:36.746696578 +0000 UTC m=+794.666789054" watchObservedRunningTime="2026-04-23 18:05:36.748003399 +0000 UTC m=+794.668095870" Apr 23 18:05:37.730686 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:37.730654 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" Apr 23 18:05:37.731153 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:37.730786 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 23 18:05:37.731758 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:37.731733 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:38.421350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.421309 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 23 18:05:38.425643 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.425621 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:5000: connect: connection refused" Apr 23 18:05:38.425741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.425728 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" 
Apr 23 18:05:38.426038 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.426017 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:38.426159 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.426144 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:05:38.738730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.738644 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 23 18:05:38.739177 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:38.739054 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:40.885403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.885379 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" Apr 23 18:05:40.921233 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.921208 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " Apr 23 18:05:40.921411 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.921267 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kserve-provision-location\") pod \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " Apr 23 18:05:40.921411 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.921321 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls\") pod \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " Apr 23 18:05:40.921411 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.921373 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zlmx\" (UniqueName: \"kubernetes.io/projected/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kube-api-access-5zlmx\") pod \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\" (UID: \"69efbdf7-2636-4f73-b23d-4b90e5ab00a7\") " Apr 23 18:05:40.921625 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.921595 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "69efbdf7-2636-4f73-b23d-4b90e5ab00a7" (UID: "69efbdf7-2636-4f73-b23d-4b90e5ab00a7"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:05:40.921625 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.921610 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "69efbdf7-2636-4f73-b23d-4b90e5ab00a7" (UID: "69efbdf7-2636-4f73-b23d-4b90e5ab00a7"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:05:40.923361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.923341 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "69efbdf7-2636-4f73-b23d-4b90e5ab00a7" (UID: "69efbdf7-2636-4f73-b23d-4b90e5ab00a7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:05:40.923432 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:40.923416 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kube-api-access-5zlmx" (OuterVolumeSpecName: "kube-api-access-5zlmx") pod "69efbdf7-2636-4f73-b23d-4b90e5ab00a7" (UID: "69efbdf7-2636-4f73-b23d-4b90e5ab00a7"). InnerVolumeSpecName "kube-api-access-5zlmx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:05:41.021958 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.021851 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:05:41.021958 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.021882 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:05:41.021958 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.021893 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:05:41.021958 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.021906 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zlmx\" (UniqueName: \"kubernetes.io/projected/69efbdf7-2636-4f73-b23d-4b90e5ab00a7-kube-api-access-5zlmx\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:05:41.748448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.748414 2568 generic.go:358] "Generic (PLEG): container finished" podID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerID="aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439" exitCode=0 Apr 23 18:05:41.748614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.748514 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" 
Apr 23 18:05:41.748614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.748546 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw" event={"ID":"69efbdf7-2636-4f73-b23d-4b90e5ab00a7","Type":"ContainerDied","Data":"fc215b02c8e3e88717f555a7f12689dba1311c645b8bcb9b6031365dd62dbe35"}
Apr 23 18:05:41.748614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.748562 2568 scope.go:117] "RemoveContainer" containerID="aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439"
Apr 23 18:05:41.748614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.748519 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"
Apr 23 18:05:41.757013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.756953 2568 scope.go:117] "RemoveContainer" containerID="f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8"
Apr 23 18:05:41.763796 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.763778 2568 scope.go:117] "RemoveContainer" containerID="7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894"
Apr 23 18:05:41.771923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.771886 2568 scope.go:117] "RemoveContainer" containerID="4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48"
Apr 23 18:05:41.772418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.772392 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"]
Apr 23 18:05:41.776223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.776201 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-cb8555798-jp4fw"]
Apr 23 18:05:41.779703 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.779684 2568 scope.go:117] "RemoveContainer" containerID="aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439"
Apr 23 18:05:41.779982 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:41.779961 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439\": container with ID starting with aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439 not found: ID does not exist" containerID="aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439"
Apr 23 18:05:41.780062 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.779990 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439"} err="failed to get container status \"aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439\": rpc error: code = NotFound desc = could not find container \"aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439\": container with ID starting with aeca5fda1a97672cd74a16315dc1e6b594b40385b6a072f2c16bf973b7ad2439 not found: ID does not exist"
Apr 23 18:05:41.780062 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.780012 2568 scope.go:117] "RemoveContainer" containerID="f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8"
Apr 23 18:05:41.780288 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:41.780271 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8\": container with ID starting with f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8 not found: ID does not exist" containerID="f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8"
Apr 23 18:05:41.780338 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.780291 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8"} err="failed to get container status \"f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8\": rpc error: code = NotFound desc = could not find container \"f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8\": container with ID starting with f2be8bc88131d27fc7850b1560d7684d14aa5ec4e8b09a810192e8aab787cea8 not found: ID does not exist"
Apr 23 18:05:41.780338 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.780305 2568 scope.go:117] "RemoveContainer" containerID="7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894"
Apr 23 18:05:41.780546 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:41.780527 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894\": container with ID starting with 7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894 not found: ID does not exist" containerID="7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894"
Apr 23 18:05:41.780582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.780553 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894"} err="failed to get container status \"7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894\": rpc error: code = NotFound desc = could not find container \"7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894\": container with ID starting with 7aa01e4d043faa6a5b480e6044e9d4b54d390edcb53d284ea9e20a13cc4b5894 not found: ID does not exist"
Apr 23 18:05:41.780582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.780572 2568 scope.go:117] "RemoveContainer" containerID="4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48"
Apr 23 18:05:41.780808 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:05:41.780791 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48\": container with ID starting with 4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48 not found: ID does not exist" containerID="4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48"
Apr 23 18:05:41.780862 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:41.780811 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48"} err="failed to get container status \"4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48\": rpc error: code = NotFound desc = could not find container \"4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48\": container with ID starting with 4bf9db6116d732b079a3a6ad3d3b11bf629254c050676b5ad9c527c0fd693a48 not found: ID does not exist"
Apr 23 18:05:42.599267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:42.599231 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" path="/var/lib/kubelet/pods/69efbdf7-2636-4f73-b23d-4b90e5ab00a7/volumes"
Apr 23 18:05:43.742376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:43.742337 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:05:43.743057 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:43.743022 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:05:43.743424 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:43.743388 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:05:53.742883 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:53.742833 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:05:53.743391 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:05:53.743284 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:06:03.743547 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:03.743491 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:06:03.743999 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:03.743974 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:06:13.743520 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:13.743479 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:06:13.744070 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:13.743970 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:06:23.742906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:23.742859 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:06:23.743384 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:23.743222 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:06:33.743119 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:33.743062 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:06:33.743668 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:33.743574 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:06:43.744070 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:43.744037 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:06:43.744502 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:43.744101 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:06:55.894557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:55.894491 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-sghjq_e2e8b183-1919-403e-8b9f-e260d2938ab7/kserve-container/0.log"
Apr 23 18:06:56.102093 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.102059 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"]
Apr 23 18:06:56.102435 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.102403 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" containerID="cri-o://65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a" gracePeriod=30
Apr 23 18:06:56.102564 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.102434 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" containerID="cri-o://5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24" gracePeriod=30
Apr 23 18:06:56.102564 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.102435 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" containerID="cri-o://2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5" gracePeriod=30
Apr 23 18:06:56.147665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147595 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"]
Apr 23 18:06:56.147901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147888 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="storage-initializer"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147903 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="storage-initializer"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147910 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147915 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147924 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147946 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147961 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy"
Apr 23 18:06:56.147970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.147966 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy"
Apr 23 18:06:56.148199 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.148012 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kserve-container"
Apr 23 18:06:56.148199 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.148021 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="agent"
Apr 23 18:06:56.148199 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.148028 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="69efbdf7-2636-4f73-b23d-4b90e5ab00a7" containerName="kube-rbac-proxy"
Apr 23 18:06:56.151474 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.151449 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.153358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.153334 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\""
Apr 23 18:06:56.153358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.153353 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\""
Apr 23 18:06:56.160868 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.160847 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"]
Apr 23 18:06:56.195845 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.195819 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"]
Apr 23 18:06:56.196252 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.196221 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kube-rbac-proxy" containerID="cri-o://fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f" gracePeriod=30
Apr 23 18:06:56.196252 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.196221 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kserve-container" containerID="cri-o://42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2" gracePeriod=30
Apr 23 18:06:56.299316 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.299276 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7ce4542-2f87-4b41-919e-89c1c90d88ec-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.299504 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.299327 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ce4542-2f87-4b41-919e-89c1c90d88ec-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.299504 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.299444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmsr\" (UniqueName: \"kubernetes.io/projected/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kube-api-access-pjmsr\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.299618 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.299515 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.400498 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.400403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7ce4542-2f87-4b41-919e-89c1c90d88ec-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.400498 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.400443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ce4542-2f87-4b41-919e-89c1c90d88ec-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.400713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.400610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmsr\" (UniqueName: \"kubernetes.io/projected/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kube-api-access-pjmsr\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.400713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.400684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.401053 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.401032 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.401209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.401193 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7ce4542-2f87-4b41-919e-89c1c90d88ec-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.402740 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.402718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ce4542-2f87-4b41-919e-89c1c90d88ec-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.409113 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.409093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmsr\" (UniqueName: \"kubernetes.io/projected/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kube-api-access-pjmsr\") pod \"isvc-lightgbm-predictor-bdf964bd-j4gkg\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.428244 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.428228 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"
Apr 23 18:06:56.463099 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.463066 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:06:56.582330 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.582271 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"]
Apr 23 18:06:56.584963 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:06:56.584922 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ce4542_2f87_4b41_919e_89c1c90d88ec.slice/crio-0bf60072ed520e78ed566d043081dbd636c4188c9019bab83744949b83e1f43f WatchSource:0}: Error finding container 0bf60072ed520e78ed566d043081dbd636c4188c9019bab83744949b83e1f43f: Status 404 returned error can't find the container with id 0bf60072ed520e78ed566d043081dbd636c4188c9019bab83744949b83e1f43f
Apr 23 18:06:56.601767 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.601743 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2e8b183-1919-403e-8b9f-e260d2938ab7-message-dumper-kube-rbac-proxy-sar-config\") pod \"e2e8b183-1919-403e-8b9f-e260d2938ab7\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") "
Apr 23 18:06:56.601873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.601797 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tltkx\" (UniqueName: \"kubernetes.io/projected/e2e8b183-1919-403e-8b9f-e260d2938ab7-kube-api-access-tltkx\") pod \"e2e8b183-1919-403e-8b9f-e260d2938ab7\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") "
Apr 23 18:06:56.601873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.601819 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls\") pod \"e2e8b183-1919-403e-8b9f-e260d2938ab7\" (UID: \"e2e8b183-1919-403e-8b9f-e260d2938ab7\") "
Apr 23 18:06:56.602175 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.602147 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e8b183-1919-403e-8b9f-e260d2938ab7-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "e2e8b183-1919-403e-8b9f-e260d2938ab7" (UID: "e2e8b183-1919-403e-8b9f-e260d2938ab7"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:06:56.603854 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.603831 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e8b183-1919-403e-8b9f-e260d2938ab7-kube-api-access-tltkx" (OuterVolumeSpecName: "kube-api-access-tltkx") pod "e2e8b183-1919-403e-8b9f-e260d2938ab7" (UID: "e2e8b183-1919-403e-8b9f-e260d2938ab7"). InnerVolumeSpecName "kube-api-access-tltkx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:06:56.603960 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.603898 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e2e8b183-1919-403e-8b9f-e260d2938ab7" (UID: "e2e8b183-1919-403e-8b9f-e260d2938ab7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:06:56.703023 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.702989 2568 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e2e8b183-1919-403e-8b9f-e260d2938ab7-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:06:56.703023 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.703018 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tltkx\" (UniqueName: \"kubernetes.io/projected/e2e8b183-1919-403e-8b9f-e260d2938ab7-kube-api-access-tltkx\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:06:56.703023 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.703028 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2e8b183-1919-403e-8b9f-e260d2938ab7-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:06:56.966692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966599 2568 generic.go:358] "Generic (PLEG): container finished" podID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerID="fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f" exitCode=2
Apr 23 18:06:56.966692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966625 2568 generic.go:358] "Generic (PLEG): container finished" podID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerID="42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2" exitCode=2
Apr 23 18:06:56.967241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966696 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"
Apr 23 18:06:56.967241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" event={"ID":"e2e8b183-1919-403e-8b9f-e260d2938ab7","Type":"ContainerDied","Data":"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"}
Apr 23 18:06:56.967241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966823 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" event={"ID":"e2e8b183-1919-403e-8b9f-e260d2938ab7","Type":"ContainerDied","Data":"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"}
Apr 23 18:06:56.967241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966845 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq" event={"ID":"e2e8b183-1919-403e-8b9f-e260d2938ab7","Type":"ContainerDied","Data":"98188dae3e69ae9bb94b1df73b9beed1f3a4e69bfe895d1bd7970c68d497f992"}
Apr 23 18:06:56.967241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.966866 2568 scope.go:117] "RemoveContainer" containerID="fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"
Apr 23 18:06:56.968374 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.968335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerStarted","Data":"f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264"}
Apr 23 18:06:56.968503 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.968382 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerStarted","Data":"0bf60072ed520e78ed566d043081dbd636c4188c9019bab83744949b83e1f43f"}
Apr 23 18:06:56.971037 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.971014 2568 generic.go:358] "Generic (PLEG): container finished" podID="b967c25c-c771-4049-b58e-99ba94d6022f" containerID="2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5" exitCode=2
Apr 23 18:06:56.971153 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.971083 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerDied","Data":"2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5"}
Apr 23 18:06:56.975537 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.975442 2568 scope.go:117] "RemoveContainer" containerID="42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"
Apr 23 18:06:56.983347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.983326 2568 scope.go:117] "RemoveContainer" containerID="fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"
Apr 23 18:06:56.983609 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:06:56.983588 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f\": container with ID starting with fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f not found: ID does not exist" containerID="fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"
Apr 23 18:06:56.983653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.983618 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"} err="failed to get container status \"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f\": rpc error: code = NotFound desc = could not find container \"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f\": container with ID starting with fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f not found: ID does not exist"
Apr 23 18:06:56.983653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.983637 2568 scope.go:117] "RemoveContainer" containerID="42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"
Apr 23 18:06:56.983922 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:06:56.983888 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2\": container with ID starting with 42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2 not found: ID does not exist" containerID="42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"
Apr 23 18:06:56.983922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.983919 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"} err="failed to get container status \"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2\": rpc error: code = NotFound desc = could not find container \"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2\": container with ID starting with 42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2 not found: ID does not exist"
Apr 23 18:06:56.983922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.983955 2568 scope.go:117] "RemoveContainer" containerID="fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"
Apr 23 18:06:56.984219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.984199 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f"} err="failed to get container status \"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f\": rpc error: code = NotFound desc = could not find container \"fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f\": container with ID starting with fdc2985a467a79e5032bdb985d3a2b5b3ab55eedd7c8ef69cc13028da0d0f86f not found: ID does not exist"
Apr 23 18:06:56.984269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.984220 2568 scope.go:117] "RemoveContainer" containerID="42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"
Apr 23 18:06:56.984474 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:56.984455 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2"} err="failed to get container status \"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2\": rpc error: code = NotFound desc = could not find container \"42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2\": container with ID starting with 42d7457d8cba025a0eb87ad5d586ccf115493a2b3926fcdeaf5c1fb5a37eacf2 not found: ID does not exist"
Apr 23 18:06:57.006056 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:57.006021 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"]
Apr 23 18:06:57.010829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:57.010800 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-sghjq"]
Apr 23 18:06:58.599204 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:58.599166 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" path="/var/lib/kubelet/pods/e2e8b183-1919-403e-8b9f-e260d2938ab7/volumes"
Apr 23 18:06:58.739374 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:06:58.739335 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused"
Apr 23 18:07:00.989416 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:00.989381 2568 generic.go:358] "Generic (PLEG): container finished" podID="b967c25c-c771-4049-b58e-99ba94d6022f" containerID="65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a" exitCode=0
Apr 23 18:07:00.989865 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:00.989452 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerDied","Data":"65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a"}
Apr 23 18:07:00.990820 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:00.990798 2568 generic.go:358] "Generic (PLEG): container finished" podID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerID="f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264" exitCode=0
Apr 23 18:07:00.990950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:00.990838 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerDied","Data":"f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264"}
Apr 23 18:07:03.739233 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:03.739133 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused"
Apr 23 18:07:03.743567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:03.743527 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:07:03.743909 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:03.743886 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:07:08.017859 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:08.017823 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerStarted","Data":"f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c"}
Apr 23 18:07:08.017859 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:08.017863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerStarted","Data":"70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70"}
Apr 23 18:07:08.018410 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:08.018089 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:07:08.036278 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:08.036206 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podStartSLOduration=5.994308429 podStartE2EDuration="12.036190392s" podCreationTimestamp="2026-04-23 18:06:56 +0000 UTC" firstStartedPulling="2026-04-23 18:07:00.992129548 +0000 UTC m=+878.912222002" lastFinishedPulling="2026-04-23 18:07:07.034011504 +0000 UTC m=+884.954103965" observedRunningTime="2026-04-23 18:07:08.035402417 +0000 UTC m=+885.955494892" watchObservedRunningTime="2026-04-23 18:07:08.036190392 +0000 UTC m=+885.956282867"
Apr 23 18:07:08.739807 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:08.739768 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused"
Apr 23 18:07:08.739995 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:08.739896 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:07:09.020582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:09.020497 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:07:09.021755 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:09.021728 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:07:10.023126 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:10.023084 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:07:13.739347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:13.739297 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused"
Apr 23 18:07:13.743711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:13.743678 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:07:13.743996 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:13.743972 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:07:15.027418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:15.027388 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"
Apr 23 18:07:15.028008 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:15.027982 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:07:18.739552 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:18.739501 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused"
Apr 23 18:07:22.569553 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:22.569530 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:07:22.569950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:22.569552 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:07:22.573524 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:22.573496 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:07:22.573778 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:22.573758 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:07:23.739055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:23.739010 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.19:8643/healthz\": dial tcp 10.133.0.19:8643: connect: connection refused"
Apr 23 18:07:23.743285 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:23.743257 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 23 18:07:23.743410 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:23.743393 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:07:23.743709 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:23.743682 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:07:23.743826 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:23.743811 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:07:25.028477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:25.028436 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:07:26.744117 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.744093 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:07:26.827064 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.827021 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b967c25c-c771-4049-b58e-99ba94d6022f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"b967c25c-c771-4049-b58e-99ba94d6022f\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") "
Apr 23 18:07:26.827266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.827090 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2sz\" (UniqueName: \"kubernetes.io/projected/b967c25c-c771-4049-b58e-99ba94d6022f-kube-api-access-kh2sz\") pod \"b967c25c-c771-4049-b58e-99ba94d6022f\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") "
Apr 23 18:07:26.827266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.827187 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b967c25c-c771-4049-b58e-99ba94d6022f-kserve-provision-location\") pod \"b967c25c-c771-4049-b58e-99ba94d6022f\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") "
Apr 23 18:07:26.827266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.827227 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls\") pod \"b967c25c-c771-4049-b58e-99ba94d6022f\" (UID: \"b967c25c-c771-4049-b58e-99ba94d6022f\") "
Apr 23 18:07:26.827431 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.827399 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b967c25c-c771-4049-b58e-99ba94d6022f-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "b967c25c-c771-4049-b58e-99ba94d6022f" (UID: "b967c25c-c771-4049-b58e-99ba94d6022f"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:07:26.827526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.827498 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b967c25c-c771-4049-b58e-99ba94d6022f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b967c25c-c771-4049-b58e-99ba94d6022f" (UID: "b967c25c-c771-4049-b58e-99ba94d6022f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:07:26.829217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.829184 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b967c25c-c771-4049-b58e-99ba94d6022f" (UID: "b967c25c-c771-4049-b58e-99ba94d6022f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:07:26.829327 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.829245 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b967c25c-c771-4049-b58e-99ba94d6022f-kube-api-access-kh2sz" (OuterVolumeSpecName: "kube-api-access-kh2sz") pod "b967c25c-c771-4049-b58e-99ba94d6022f" (UID: "b967c25c-c771-4049-b58e-99ba94d6022f"). InnerVolumeSpecName "kube-api-access-kh2sz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:07:26.928614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.928531 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b967c25c-c771-4049-b58e-99ba94d6022f-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:07:26.928614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.928563 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kh2sz\" (UniqueName: \"kubernetes.io/projected/b967c25c-c771-4049-b58e-99ba94d6022f-kube-api-access-kh2sz\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:07:26.928614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.928574 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b967c25c-c771-4049-b58e-99ba94d6022f-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:07:26.928614 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:26.928586 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b967c25c-c771-4049-b58e-99ba94d6022f-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:07:27.072091 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.072061 2568 generic.go:358] "Generic (PLEG): container finished" podID="b967c25c-c771-4049-b58e-99ba94d6022f" containerID="5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24" exitCode=0
Apr 23 18:07:27.072251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.072114 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerDied","Data":"5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24"}
Apr 23 18:07:27.072251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.072160 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"
Apr 23 18:07:27.072251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.072168 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb" event={"ID":"b967c25c-c771-4049-b58e-99ba94d6022f","Type":"ContainerDied","Data":"988988277c9f811f3cbfd7dc57632a26034d9f0b9ca96022026580cee72d90af"}
Apr 23 18:07:27.072251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.072193 2568 scope.go:117] "RemoveContainer" containerID="5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24"
Apr 23 18:07:27.080904 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.080882 2568 scope.go:117] "RemoveContainer" containerID="2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5"
Apr 23 18:07:27.088160 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.088136 2568 scope.go:117] "RemoveContainer" containerID="65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a"
Apr 23 18:07:27.096145 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.095984 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"]
Apr 23 18:07:27.096218 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.096185 2568 scope.go:117] "RemoveContainer" containerID="630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673"
Apr 23 18:07:27.099962 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.099922 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69b866599d-9j5jb"]
Apr 23 18:07:27.103780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.103760 2568 scope.go:117] "RemoveContainer" containerID="5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24"
Apr 23 18:07:27.104110 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:07:27.104086 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24\": container with ID starting with 5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24 not found: ID does not exist" containerID="5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24"
Apr 23 18:07:27.104166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.104116 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24"} err="failed to get container status \"5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24\": rpc error: code = NotFound desc = could not find container \"5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24\": container with ID starting with 5d84420988c9556e7c28a8dc010fb394a1a0457e09cc151342cb75a8e1980d24 not found: ID does not exist"
Apr 23 18:07:27.104166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.104136 2568 scope.go:117] "RemoveContainer" containerID="2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5"
Apr 23 18:07:27.104402 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:07:27.104381 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5\": container with ID starting with 2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5 not found: ID does not exist" containerID="2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5"
containerID="2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5" Apr 23 18:07:27.104455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.104408 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5"} err="failed to get container status \"2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5\": rpc error: code = NotFound desc = could not find container \"2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5\": container with ID starting with 2bc8f05ad732d1000907d1cd39e5bc895419d0412b6e07207cbb9a3117b4cac5 not found: ID does not exist" Apr 23 18:07:27.104455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.104425 2568 scope.go:117] "RemoveContainer" containerID="65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a" Apr 23 18:07:27.104672 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:07:27.104656 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a\": container with ID starting with 65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a not found: ID does not exist" containerID="65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a" Apr 23 18:07:27.104721 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.104675 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a"} err="failed to get container status \"65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a\": rpc error: code = NotFound desc = could not find container \"65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a\": container with ID starting with 65ffb5f34346220dcf7d00617ee31559d29ff2a4bcd880909ecfa1bf9aa31b3a not found: ID does not exist" Apr 23 18:07:27.104721 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.104688 2568 scope.go:117] "RemoveContainer" containerID="630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673" Apr 23 18:07:27.104952 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:07:27.104914 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673\": container with ID starting with 630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673 not found: ID does not exist" containerID="630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673" Apr 23 18:07:27.105070 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:27.105051 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673"} err="failed to get container status \"630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673\": rpc error: code = NotFound desc = could not find container \"630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673\": container with ID starting with 630edd76daa2480bab8fe8ddc52ad0d1a2c9e4dfa8a48e894a0cf00941373673 not found: ID does not exist" Apr 23 18:07:28.598775 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:28.598741 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" path="/var/lib/kubelet/pods/b967c25c-c771-4049-b58e-99ba94d6022f/volumes" Apr 23 
18:07:35.028561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:35.028523 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:07:45.028057 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:45.028014 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:07:55.028121 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:07:55.028080 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:08:05.028026 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:05.027984 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:08:15.028232 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:15.028189 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:08:25.028082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:25.028046 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" Apr 23 18:08:26.263383 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.263347 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"] Apr 23 18:08:26.263765 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.263628 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" containerID="cri-o://70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70" gracePeriod=30 Apr 23 18:08:26.263765 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.263694 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kube-rbac-proxy" containerID="cri-o://f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c" gracePeriod=30 Apr 23 18:08:26.384138 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384107 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8"] Apr 23 18:08:26.384395 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384380 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="storage-initializer" Apr 23 18:08:26.384451 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:08:26.384398 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="storage-initializer" Apr 23 18:08:26.384451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384407 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kserve-container" Apr 23 18:08:26.384451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384412 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kserve-container" Apr 23 18:08:26.384451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384423 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" Apr 23 18:08:26.384451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384428 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" Apr 23 18:08:26.384451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384445 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" Apr 23 18:08:26.384451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384451 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384458 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kube-rbac-proxy" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384466 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kube-rbac-proxy" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384474 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384479 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384543 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kube-rbac-proxy" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384553 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kube-rbac-proxy" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384560 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="agent" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384567 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2e8b183-1919-403e-8b9f-e260d2938ab7" containerName="kserve-container" Apr 23 18:08:26.384664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.384574 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b967c25c-c771-4049-b58e-99ba94d6022f" containerName="kserve-container" Apr 23 18:08:26.387528 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:08:26.387512 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.389227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.389210 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 23 18:08:26.389293 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.389227 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:08:26.397223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.397126 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8"] Apr 23 18:08:26.483254 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.483223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwmv\" (UniqueName: \"kubernetes.io/projected/45fb45b8-ae27-4403-be3b-94860db0002c-kube-api-access-ljwmv\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.483420 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.483267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45fb45b8-ae27-4403-be3b-94860db0002c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.483420 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.483319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.483420 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.483346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45fb45b8-ae27-4403-be3b-94860db0002c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.584081 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.584049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.584261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.584100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/45fb45b8-ae27-4403-be3b-94860db0002c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.584261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.584141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwmv\" (UniqueName: \"kubernetes.io/projected/45fb45b8-ae27-4403-be3b-94860db0002c-kube-api-access-ljwmv\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.584261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.584181 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45fb45b8-ae27-4403-be3b-94860db0002c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.584261 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:08:26.584207 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-serving-cert: secret "isvc-lightgbm-runtime-predictor-serving-cert" not found Apr 23 18:08:26.584493 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:08:26.584281 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls podName:45fb45b8-ae27-4403-be3b-94860db0002c nodeName:}" failed. No retries permitted until 2026-04-23 18:08:27.08426217 +0000 UTC m=+965.004354637 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls") pod "isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" (UID: "45fb45b8-ae27-4403-be3b-94860db0002c") : secret "isvc-lightgbm-runtime-predictor-serving-cert" not found Apr 23 18:08:26.584539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.584497 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45fb45b8-ae27-4403-be3b-94860db0002c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.584823 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.584803 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45fb45b8-ae27-4403-be3b-94860db0002c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:26.611968 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:26.611925 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwmv\" (UniqueName: \"kubernetes.io/projected/45fb45b8-ae27-4403-be3b-94860db0002c-kube-api-access-ljwmv\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:27.086918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:27.086878 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:27.089354 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:27.089333 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:27.239810 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:27.239771 2568 generic.go:358] "Generic (PLEG): container finished" podID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerID="f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c" exitCode=2 Apr 23 18:08:27.239991 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:27.239820 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerDied","Data":"f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c"} Apr 23 18:08:27.298164 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:27.298126 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:27.419218 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:27.419186 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8"] Apr 23 18:08:27.422180 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:08:27.422154 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45fb45b8_ae27_4403_be3b_94860db0002c.slice/crio-34eee23ac06e0b2582dbb0c18847502ab8aec2e9bce753f39f41c3117c178a32 WatchSource:0}: Error finding container 34eee23ac06e0b2582dbb0c18847502ab8aec2e9bce753f39f41c3117c178a32: Status 404 returned error can't find the container with id 34eee23ac06e0b2582dbb0c18847502ab8aec2e9bce753f39f41c3117c178a32 Apr 23 18:08:28.243692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:28.243648 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerStarted","Data":"d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89"} Apr 23 18:08:28.243692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:28.243684 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerStarted","Data":"34eee23ac06e0b2582dbb0c18847502ab8aec2e9bce753f39f41c3117c178a32"} Apr 23 18:08:30.023376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.023337 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 23 18:08:30.704787 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.704765 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" Apr 23 18:08:30.711074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.711056 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ce4542-2f87-4b41-919e-89c1c90d88ec-proxy-tls\") pod \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " Apr 23 18:08:30.711189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.711085 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kserve-provision-location\") pod \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " Apr 23 18:08:30.711189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.711128 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7ce4542-2f87-4b41-919e-89c1c90d88ec-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " Apr 23 18:08:30.711189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.711149 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjmsr\" (UniqueName: \"kubernetes.io/projected/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kube-api-access-pjmsr\") pod \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\" (UID: \"d7ce4542-2f87-4b41-919e-89c1c90d88ec\") " Apr 23 18:08:30.711444 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.711419 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7ce4542-2f87-4b41-919e-89c1c90d88ec" (UID: "d7ce4542-2f87-4b41-919e-89c1c90d88ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:08:30.711444 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.711447 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ce4542-2f87-4b41-919e-89c1c90d88ec-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "d7ce4542-2f87-4b41-919e-89c1c90d88ec" (UID: "d7ce4542-2f87-4b41-919e-89c1c90d88ec"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:08:30.713106 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.713086 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ce4542-2f87-4b41-919e-89c1c90d88ec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d7ce4542-2f87-4b41-919e-89c1c90d88ec" (UID: "d7ce4542-2f87-4b41-919e-89c1c90d88ec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:08:30.713229 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.713210 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kube-api-access-pjmsr" (OuterVolumeSpecName: "kube-api-access-pjmsr") pod "d7ce4542-2f87-4b41-919e-89c1c90d88ec" (UID: "d7ce4542-2f87-4b41-919e-89c1c90d88ec"). InnerVolumeSpecName "kube-api-access-pjmsr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:08:30.811785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.811698 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ce4542-2f87-4b41-919e-89c1c90d88ec-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:08:30.811785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.811732 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:08:30.811785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.811748 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7ce4542-2f87-4b41-919e-89c1c90d88ec-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:08:30.811785 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:30.811758 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjmsr\" (UniqueName: \"kubernetes.io/projected/d7ce4542-2f87-4b41-919e-89c1c90d88ec-kube-api-access-pjmsr\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:08:31.251602 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.251570 2568 generic.go:358] "Generic (PLEG): container finished" podID="45fb45b8-ae27-4403-be3b-94860db0002c" containerID="d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89" exitCode=0 Apr 23 18:08:31.252023 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.251650 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerDied","Data":"d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89"} Apr 23 18:08:31.253308 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.253289 2568 generic.go:358] "Generic (PLEG): container finished" podID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerID="70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70" exitCode=0 Apr 23 18:08:31.253403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.253357 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerDied","Data":"70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70"} Apr 23 18:08:31.253403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.253379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" event={"ID":"d7ce4542-2f87-4b41-919e-89c1c90d88ec","Type":"ContainerDied","Data":"0bf60072ed520e78ed566d043081dbd636c4188c9019bab83744949b83e1f43f"} Apr 23 18:08:31.253403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.253379 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg" Apr 23 18:08:31.253403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.253394 2568 scope.go:117] "RemoveContainer" containerID="f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c" Apr 23 18:08:31.260774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.260752 2568 scope.go:117] "RemoveContainer" containerID="70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70" Apr 23 18:08:31.270764 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.270742 2568 scope.go:117] "RemoveContainer" containerID="f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264" Apr 23 18:08:31.282799 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.282778 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"] Apr 23 18:08:31.289055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.289034 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4gkg"] Apr 23 18:08:31.289438 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.289415 2568 scope.go:117] "RemoveContainer" containerID="f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c" Apr 23 18:08:31.289734 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:08:31.289712 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c\": container with ID starting with f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c not found: ID does not exist" containerID="f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c" Apr 23 18:08:31.289827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.289745 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c"} err="failed to get container status \"f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c\": rpc error: code = NotFound desc = could not find container \"f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c\": container with ID starting with f1ddeeafa88bdf7a154c36a205f6cff8a6ee36a66895bd693227ea00f94ef82c not found: ID does not exist" Apr 23 18:08:31.289827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.289774 2568 scope.go:117] "RemoveContainer" containerID="70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70" Apr 23 18:08:31.290102 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:08:31.290074 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70\": container with ID starting with 70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70 not found: ID does not exist" containerID="70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70" Apr 23 18:08:31.290165 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.290108 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70"} err="failed to get container status \"70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70\": rpc error: code = NotFound desc = could not find container \"70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70\": 
container with ID starting with 70362469ac383c910a28e984cef6f09285676ff0594daa6c8d238a5545f52b70 not found: ID does not exist" Apr 23 18:08:31.290165 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.290124 2568 scope.go:117] "RemoveContainer" containerID="f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264" Apr 23 18:08:31.290372 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:08:31.290356 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264\": container with ID starting with f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264 not found: ID does not exist" containerID="f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264" Apr 23 18:08:31.290416 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:31.290374 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264"} err="failed to get container status \"f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264\": rpc error: code = NotFound desc = could not find container \"f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264\": container with ID starting with f6979b71754a315d465defed5a864784547ba5eadab22d7e8cc25b5706b7a264 not found: ID does not exist" Apr 23 18:08:32.257467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.257382 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerStarted","Data":"65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced"} Apr 23 18:08:32.257467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.257427 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerStarted","Data":"e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd"} Apr 23 18:08:32.257963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.257738 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:32.257963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.257768 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:32.258920 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.258898 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:08:32.277173 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.277131 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podStartSLOduration=6.277116908 podStartE2EDuration="6.277116908s" podCreationTimestamp="2026-04-23 18:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:08:32.275052306 +0000 UTC m=+970.195144781" 
watchObservedRunningTime="2026-04-23 18:08:32.277116908 +0000 UTC m=+970.197209384" Apr 23 18:08:32.598674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:32.598644 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" path="/var/lib/kubelet/pods/d7ce4542-2f87-4b41-919e-89c1c90d88ec/volumes" Apr 23 18:08:33.261379 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:33.261345 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:08:38.265428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:38.265400 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:08:38.266013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:38.265981 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:08:48.266569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:48.266527 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:08:58.266480 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:08:58.266440 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:09:08.266113 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:08.266073 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:09:18.266408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:18.266355 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:09:28.266370 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:28.266333 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:09:38.266252 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:38.266210 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:09:42.596793 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:42.596746 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:09:52.598582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:52.598553 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:09:56.719392 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.719360 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8"] Apr 23 18:09:56.719776 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.719720 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" containerID="cri-o://e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd" gracePeriod=30 Apr 23 18:09:56.719828 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.719800 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kube-rbac-proxy" containerID="cri-o://65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced" gracePeriod=30 Apr 23 18:09:56.840417 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840389 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm"] Apr 23 18:09:56.840661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840649 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="storage-initializer" Apr 23 18:09:56.840730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840662 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="storage-initializer" Apr 23 18:09:56.840730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840671 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" Apr 23 18:09:56.840730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840682 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" Apr 23 18:09:56.840730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840698 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kube-rbac-proxy" Apr 23 18:09:56.840730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840704 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kube-rbac-proxy" Apr 23 18:09:56.840895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.840745 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kube-rbac-proxy" Apr 23 18:09:56.840895 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:09:56.840752 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7ce4542-2f87-4b41-919e-89c1c90d88ec" containerName="kserve-container" Apr 23 18:09:56.848217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.848195 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:56.850172 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.850149 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 23 18:09:56.850289 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.850227 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:09:56.855010 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.854813 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm"] Apr 23 18:09:56.907075 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.907041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:56.907238 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.907077 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvbr\" (UniqueName: \"kubernetes.io/projected/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kube-api-access-gkvbr\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:56.907238 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.907174 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:56.907238 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:56.907212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.008172 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.008079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.008172 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.008117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.008172 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.008164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.008448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.008183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvbr\" (UniqueName: \"kubernetes.io/projected/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kube-api-access-gkvbr\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.008631 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.008610 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.008888 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.008870 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.010523 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.010505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.016511 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.016488 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkvbr\" (UniqueName: \"kubernetes.io/projected/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kube-api-access-gkvbr\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.158905 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:09:57.158867 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:09:57.277369 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.277293 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm"] Apr 23 18:09:57.280498 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:09:57.280471 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6236ec1f_b23f_4f89_be8e_ed5ebd0237bb.slice/crio-b5021deca92977e86516b66b762bb9fd7cbfecb74471b12ed021d171b04761d7 WatchSource:0}: Error finding container b5021deca92977e86516b66b762bb9fd7cbfecb74471b12ed021d171b04761d7: Status 404 returned error can't find the container with id b5021deca92977e86516b66b762bb9fd7cbfecb74471b12ed021d171b04761d7 Apr 23 18:09:57.487281 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.487239 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerStarted","Data":"8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f"} Apr 23 18:09:57.487281 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.487280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerStarted","Data":"b5021deca92977e86516b66b762bb9fd7cbfecb74471b12ed021d171b04761d7"} Apr 23 18:09:57.489362 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.489335 2568 generic.go:358] "Generic (PLEG): container finished" podID="45fb45b8-ae27-4403-be3b-94860db0002c" containerID="65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced" exitCode=2 Apr 23 18:09:57.489613 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:57.489394 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerDied","Data":"65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced"} Apr 23 18:09:58.262615 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:09:58.262570 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 23 18:10:01.503026 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.502991 2568 generic.go:358] "Generic (PLEG): container finished" podID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerID="8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f" exitCode=0 Apr 23 18:10:01.503394 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.503064 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerDied","Data":"8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f"} Apr 23 18:10:01.666303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.666245 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:10:01.744183 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.744095 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45fb45b8-ae27-4403-be3b-94860db0002c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"45fb45b8-ae27-4403-be3b-94860db0002c\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " Apr 23 18:10:01.744183 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.744166 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljwmv\" (UniqueName: \"kubernetes.io/projected/45fb45b8-ae27-4403-be3b-94860db0002c-kube-api-access-ljwmv\") pod \"45fb45b8-ae27-4403-be3b-94860db0002c\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " Apr 23 18:10:01.744415 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.744217 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45fb45b8-ae27-4403-be3b-94860db0002c-kserve-provision-location\") pod \"45fb45b8-ae27-4403-be3b-94860db0002c\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " Apr 23 18:10:01.744415 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.744245 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls\") pod \"45fb45b8-ae27-4403-be3b-94860db0002c\" (UID: \"45fb45b8-ae27-4403-be3b-94860db0002c\") " Apr 23 18:10:01.744554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.744517 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fb45b8-ae27-4403-be3b-94860db0002c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "45fb45b8-ae27-4403-be3b-94860db0002c" (UID: "45fb45b8-ae27-4403-be3b-94860db0002c"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:10:01.744667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.744620 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45fb45b8-ae27-4403-be3b-94860db0002c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45fb45b8-ae27-4403-be3b-94860db0002c" (UID: "45fb45b8-ae27-4403-be3b-94860db0002c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:10:01.746357 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.746330 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "45fb45b8-ae27-4403-be3b-94860db0002c" (UID: "45fb45b8-ae27-4403-be3b-94860db0002c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:10:01.746456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.746393 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fb45b8-ae27-4403-be3b-94860db0002c-kube-api-access-ljwmv" (OuterVolumeSpecName: "kube-api-access-ljwmv") pod "45fb45b8-ae27-4403-be3b-94860db0002c" (UID: "45fb45b8-ae27-4403-be3b-94860db0002c"). InnerVolumeSpecName "kube-api-access-ljwmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:10:01.845323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.845292 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45fb45b8-ae27-4403-be3b-94860db0002c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:10:01.845323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.845321 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljwmv\" (UniqueName: \"kubernetes.io/projected/45fb45b8-ae27-4403-be3b-94860db0002c-kube-api-access-ljwmv\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:10:01.845323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.845331 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45fb45b8-ae27-4403-be3b-94860db0002c-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:10:01.845554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:01.845340 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45fb45b8-ae27-4403-be3b-94860db0002c-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:10:02.512624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.510085 2568 generic.go:358] "Generic (PLEG): container finished" podID="45fb45b8-ae27-4403-be3b-94860db0002c" containerID="e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd" exitCode=0 Apr 23 18:10:02.512624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.510205 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerDied","Data":"e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd"} Apr 23 18:10:02.512624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.510238 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" event={"ID":"45fb45b8-ae27-4403-be3b-94860db0002c","Type":"ContainerDied","Data":"34eee23ac06e0b2582dbb0c18847502ab8aec2e9bce753f39f41c3117c178a32"} Apr 23 18:10:02.512624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.510262 2568 scope.go:117] "RemoveContainer" containerID="65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced" Apr 23 18:10:02.512624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.510466 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8" Apr 23 18:10:02.531589 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.531563 2568 scope.go:117] "RemoveContainer" containerID="e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd" Apr 23 18:10:02.541562 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.541505 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8"] Apr 23 18:10:02.543995 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.543974 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-jqhq8"] Apr 23 18:10:02.549714 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.549697 2568 scope.go:117] "RemoveContainer" containerID="d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89" Apr 23 18:10:02.567293 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.567215 2568 scope.go:117] "RemoveContainer" containerID="65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced" Apr 23 18:10:02.567653 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:10:02.567597 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced\": container with ID starting with 65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced not found: ID does not exist" containerID="65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced" Apr 23 18:10:02.567805 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.567641 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced"} err="failed to get container status \"65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced\": rpc error: code = NotFound desc = could not find container \"65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced\": container with ID starting with 65839c0f1b4349e57cdda09e87b0c294ff6369b4b941c7c6bb621ea460cffced not found: ID does not exist" Apr 23 18:10:02.567805 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.567670 2568 scope.go:117] "RemoveContainer" containerID="e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd" Apr 23 18:10:02.568426 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:10:02.568281 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd\": container with ID starting with e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd not found: ID does not exist" containerID="e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd" Apr 23 18:10:02.568426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.568315 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd"} err="failed to get container status \"e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd\": rpc error: code = NotFound desc = could not find container \"e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd\": container with ID starting with e2b861d6a82a913e0852394c4cd6e2bd66c928c6fb723010c0577945441c97dd not found: ID does not exist" Apr 23 18:10:02.568426 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:10:02.568339 2568 scope.go:117] "RemoveContainer" containerID="d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89" Apr 23 18:10:02.568989 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:10:02.568890 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89\": container with ID starting with d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89 not found: ID does not exist" containerID="d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89" Apr 23 18:10:02.568989 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.568950 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89"} err="failed to get container status \"d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89\": rpc error: code = NotFound desc = could not find container \"d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89\": container with ID starting with d4e70083795c6eb401deb7d137588254825e7875647036a676f151825b213f89 not found: ID does not exist" Apr 23 18:10:02.601729 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:10:02.601255 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" path="/var/lib/kubelet/pods/45fb45b8-ae27-4403-be3b-94860db0002c/volumes" Apr 23 18:12:03.211454 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:03.211431 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:12:03.903892 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:03.903846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerStarted","Data":"06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c"} Apr 23 18:12:03.903892 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:03.903890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerStarted","Data":"240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8"} Apr 23 18:12:03.904147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:03.904060 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:12:03.934641 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:03.934584 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" podStartSLOduration=6.339680483 podStartE2EDuration="2m7.934567289s" podCreationTimestamp="2026-04-23 18:09:56 +0000 UTC" firstStartedPulling="2026-04-23 18:10:01.504190064 +0000 UTC m=+1059.424282517" lastFinishedPulling="2026-04-23 18:12:03.099076861 +0000 UTC m=+1181.019169323" observedRunningTime="2026-04-23 18:12:03.933757795 +0000 UTC m=+1181.853850293" watchObservedRunningTime="2026-04-23 18:12:03.934567289 +0000 UTC m=+1181.854659766" Apr 23 18:12:04.907731 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:04.907688 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:12:10.916039 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:10.916005 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:12:22.589191 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:22.589161 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:12:22.589622 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:22.589434 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:12:22.593002 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:22.592982 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:12:22.593579 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:22.593562 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:12:40.919594 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:40.919562 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:12:47.074485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.074446 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm"] Apr 23 18:12:47.075010 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.074746 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kserve-container" containerID="cri-o://240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8" gracePeriod=30 Apr 23 18:12:47.075010 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.074808 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kube-rbac-proxy" containerID="cri-o://06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c" gracePeriod=30 Apr 23 18:12:47.209911 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.209877 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"] Apr 23 18:12:47.210233 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210214 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" Apr 23 18:12:47.210300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210236 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" Apr 23 18:12:47.210300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210252 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" 
containerName="storage-initializer" Apr 23 18:12:47.210300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210261 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="storage-initializer" Apr 23 18:12:47.210300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210276 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kube-rbac-proxy" Apr 23 18:12:47.210300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210285 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kube-rbac-proxy" Apr 23 18:12:47.210503 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210345 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kube-rbac-proxy" Apr 23 18:12:47.210503 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.210355 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="45fb45b8-ae27-4403-be3b-94860db0002c" containerName="kserve-container" Apr 23 18:12:47.213431 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.213411 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.215814 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.215789 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 23 18:12:47.215975 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.215789 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 18:12:47.224582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.224545 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"] Apr 23 18:12:47.293151 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.293103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.293360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.293213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aece6290-7c19-48b1-8de7-9bf0f3a382c9-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.293360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.293247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n6c\" (UniqueName: \"kubernetes.io/projected/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kube-api-access-v7n6c\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.293360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.293275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aece6290-7c19-48b1-8de7-9bf0f3a382c9-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.393856 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.393751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.393856 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.393821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aece6290-7c19-48b1-8de7-9bf0f3a382c9-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.393856 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.393848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n6c\" (UniqueName: \"kubernetes.io/projected/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kube-api-access-v7n6c\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.394142 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.393876 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aece6290-7c19-48b1-8de7-9bf0f3a382c9-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.394271 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.394251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.394540 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.394519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aece6290-7c19-48b1-8de7-9bf0f3a382c9-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.396519 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.396483 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aece6290-7c19-48b1-8de7-9bf0f3a382c9-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.403902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.403855 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7n6c\" (UniqueName: \"kubernetes.io/projected/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kube-api-access-v7n6c\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.525345 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.525311 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:47.657049 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:47.656961 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"] Apr 23 18:12:47.660187 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:12:47.660149 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaece6290_7c19_48b1_8de7_9bf0f3a382c9.slice/crio-bb553fc3c0dae6f1336ddcc0cad65544664cf265e900a2c7e0ae6a28ac001fd6 WatchSource:0}: Error finding container bb553fc3c0dae6f1336ddcc0cad65544664cf265e900a2c7e0ae6a28ac001fd6: Status 404 returned error can't find the container with id bb553fc3c0dae6f1336ddcc0cad65544664cf265e900a2c7e0ae6a28ac001fd6 Apr 23 18:12:48.024670 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.024557 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerStarted","Data":"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"} Apr 23 18:12:48.024670 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.024609 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerStarted","Data":"bb553fc3c0dae6f1336ddcc0cad65544664cf265e900a2c7e0ae6a28ac001fd6"} Apr 23 18:12:48.026707 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.026677 2568 generic.go:358] "Generic (PLEG): container finished" podID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerID="06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c" exitCode=2 Apr 23 18:12:48.026828 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.026751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerDied","Data":"06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c"} Apr 23 18:12:48.315863 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.315840 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:12:48.403240 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.403204 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " Apr 23 18:12:48.403467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.403249 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkvbr\" (UniqueName: \"kubernetes.io/projected/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kube-api-access-gkvbr\") pod \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " Apr 23 18:12:48.403467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.403290 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-proxy-tls\") pod \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " Apr 23 18:12:48.403467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.403310 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kserve-provision-location\") pod \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\" (UID: \"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb\") " Apr 23 18:12:48.403675 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.403644 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" (UID: "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:12:48.403750 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.403691 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" (UID: "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:12:48.405394 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.405362 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" (UID: "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:12:48.405538 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.405516 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kube-api-access-gkvbr" (OuterVolumeSpecName: "kube-api-access-gkvbr") pod "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" (UID: "6236ec1f-b23f-4f89-be8e-ed5ebd0237bb"). InnerVolumeSpecName "kube-api-access-gkvbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:12:48.504385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.504350 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:12:48.504385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.504381 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:12:48.504385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.504395 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:12:48.504640 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:48.504411 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gkvbr\" (UniqueName: \"kubernetes.io/projected/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb-kube-api-access-gkvbr\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:12:49.031806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.031712 2568 generic.go:358] "Generic (PLEG): container finished" podID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerID="240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8" exitCode=0 Apr 23 18:12:49.031806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.031754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerDied","Data":"240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8"} Apr 23 18:12:49.031806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.031792 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" Apr 23 18:12:49.031806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.031797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm" event={"ID":"6236ec1f-b23f-4f89-be8e-ed5ebd0237bb","Type":"ContainerDied","Data":"b5021deca92977e86516b66b762bb9fd7cbfecb74471b12ed021d171b04761d7"} Apr 23 18:12:49.032170 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.031819 2568 scope.go:117] "RemoveContainer" containerID="06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c" Apr 23 18:12:49.039750 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.039729 2568 scope.go:117] "RemoveContainer" containerID="240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8" Apr 23 18:12:49.046885 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.046863 2568 scope.go:117] "RemoveContainer" containerID="8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f" Apr 23 18:12:49.051797 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.051773 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm"] Apr 23 18:12:49.054352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.054335 2568 scope.go:117] "RemoveContainer" containerID="06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c" Apr 23 18:12:49.054617 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:12:49.054597 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c\": container with ID starting with 06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c not found: ID does not exist" containerID="06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c" Apr 23 18:12:49.054658 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.054627 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c"} err="failed to get container status \"06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c\": rpc error: code = NotFound desc = could not find container \"06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c\": container with ID starting with 06afca0f54989198495590c32d0835a09ae5d630258daf9a5a79c2da10d3677c not found: ID does not exist" Apr 23 18:12:49.054658 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.054647 2568 scope.go:117] "RemoveContainer" containerID="240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8" Apr 23 18:12:49.054879 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:12:49.054858 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8\": container with ID starting with 240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8 not found: ID does not exist" containerID="240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8" Apr 23 18:12:49.054922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.054886 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8"} err="failed to get container status 
\"240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8\": rpc error: code = NotFound desc = could not find container \"240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8\": container with ID starting with 240288aee8d032af011c354ba5f17119a1cbed9c400fc3caa8512bcf1e2acdf8 not found: ID does not exist" Apr 23 18:12:49.054922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.054904 2568 scope.go:117] "RemoveContainer" containerID="8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f" Apr 23 18:12:49.055149 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:12:49.055131 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f\": container with ID starting with 8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f not found: ID does not exist" containerID="8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f" Apr 23 18:12:49.055194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.055154 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f"} err="failed to get container status \"8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f\": rpc error: code = NotFound desc = could not find container \"8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f\": container with ID starting with 8e7c705239d7376f228110d880e96044c383ef58cc5179ac8fc7aa36f014cb5f not found: ID does not exist" Apr 23 18:12:49.061875 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:49.061854 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-zq5vm"] Apr 23 18:12:50.598651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:50.598616 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" path="/var/lib/kubelet/pods/6236ec1f-b23f-4f89-be8e-ed5ebd0237bb/volumes" Apr 23 18:12:52.041385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:52.041352 2568 generic.go:358] "Generic (PLEG): container finished" podID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerID="7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78" exitCode=0 Apr 23 18:12:52.041827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:52.041429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerDied","Data":"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"} Apr 23 18:12:53.046251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:53.046212 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerStarted","Data":"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"} Apr 23 18:12:53.046251 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:53.046257 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerStarted","Data":"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"} Apr 23 18:12:53.046706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:53.046548 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:53.046706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:53.046691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:53.047860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:53.047835 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:12:53.067666 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:53.067621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" podStartSLOduration=6.067604579 podStartE2EDuration="6.067604579s" podCreationTimestamp="2026-04-23 18:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:12:53.066256936 +0000 UTC m=+1230.986349411" watchObservedRunningTime="2026-04-23 18:12:53.067604579 +0000 UTC m=+1230.987697054" Apr 23 18:12:54.049371 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:54.049327 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:12:59.054639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:59.054603 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:12:59.055298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:12:59.055269 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:13:09.055947 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:09.055856 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:13:17.222348 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.222318 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"] Apr 23 18:13:17.222756 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.222627 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kserve-container" containerID="cri-o://e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374" gracePeriod=30 Apr 23 18:13:17.222756 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.222664 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kube-rbac-proxy" 
containerID="cri-o://1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2" gracePeriod=30 Apr 23 18:13:17.315095 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315049 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"] Apr 23 18:13:17.315387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315372 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kserve-container" Apr 23 18:13:17.315445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315390 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kserve-container" Apr 23 18:13:17.315445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315406 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kube-rbac-proxy" Apr 23 18:13:17.315445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315411 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kube-rbac-proxy" Apr 23 18:13:17.315445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315419 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="storage-initializer" Apr 23 18:13:17.315445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315426 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="storage-initializer" Apr 23 18:13:17.315683 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315475 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kserve-container" Apr 23 18:13:17.315683 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.315485 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6236ec1f-b23f-4f89-be8e-ed5ebd0237bb" containerName="kube-rbac-proxy" Apr 23 18:13:17.318066 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.318043 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.319715 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.319691 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:13:17.319814 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.319743 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 23 18:13:17.328644 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.328616 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"] Apr 23 18:13:17.425362 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.425319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/039d4720-5427-47aa-9d46-2deec9604dd6-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.425576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.425390 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.425576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.425429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/039d4720-5427-47aa-9d46-2deec9604dd6-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.425576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.425471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56kr5\" (UniqueName: \"kubernetes.io/projected/039d4720-5427-47aa-9d46-2deec9604dd6-kube-api-access-56kr5\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.526893 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.526785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/039d4720-5427-47aa-9d46-2deec9604dd6-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.526893 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.526833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56kr5\" (UniqueName: 
\"kubernetes.io/projected/039d4720-5427-47aa-9d46-2deec9604dd6-kube-api-access-56kr5\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.526893 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.526866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/039d4720-5427-47aa-9d46-2deec9604dd6-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.527206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.526903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.527206 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:17.527110 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 23 18:13:17.527206 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:17.527172 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls podName:039d4720-5427-47aa-9d46-2deec9604dd6 nodeName:}" failed. No retries permitted until 2026-04-23 18:13:18.027153313 +0000 UTC m=+1255.947245767 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" (UID: "039d4720-5427-47aa-9d46-2deec9604dd6") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 23 18:13:17.527363 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.527260 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/039d4720-5427-47aa-9d46-2deec9604dd6-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.527572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.527551 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/039d4720-5427-47aa-9d46-2deec9604dd6-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.536030 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.536001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56kr5\" (UniqueName: \"kubernetes.io/projected/039d4720-5427-47aa-9d46-2deec9604dd6-kube-api-access-56kr5\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:17.863335 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.863312 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" Apr 23 18:13:17.929292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.929264 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aece6290-7c19-48b1-8de7-9bf0f3a382c9-proxy-tls\") pod \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " Apr 23 18:13:17.929482 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.929335 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kserve-provision-location\") pod \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " Apr 23 18:13:17.929482 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.929361 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7n6c\" (UniqueName: \"kubernetes.io/projected/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kube-api-access-v7n6c\") pod \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " Apr 23 18:13:17.929482 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.929392 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aece6290-7c19-48b1-8de7-9bf0f3a382c9-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\" (UID: \"aece6290-7c19-48b1-8de7-9bf0f3a382c9\") " Apr 23 18:13:17.929727 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.929693 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aece6290-7c19-48b1-8de7-9bf0f3a382c9" (UID: "aece6290-7c19-48b1-8de7-9bf0f3a382c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:13:17.929778 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.929749 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aece6290-7c19-48b1-8de7-9bf0f3a382c9-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "aece6290-7c19-48b1-8de7-9bf0f3a382c9" (UID: "aece6290-7c19-48b1-8de7-9bf0f3a382c9"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:13:17.931411 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.931391 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6290-7c19-48b1-8de7-9bf0f3a382c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "aece6290-7c19-48b1-8de7-9bf0f3a382c9" (UID: "aece6290-7c19-48b1-8de7-9bf0f3a382c9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:13:17.931483 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:17.931441 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kube-api-access-v7n6c" (OuterVolumeSpecName: "kube-api-access-v7n6c") pod "aece6290-7c19-48b1-8de7-9bf0f3a382c9" (UID: "aece6290-7c19-48b1-8de7-9bf0f3a382c9"). InnerVolumeSpecName "kube-api-access-v7n6c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:13:18.030715 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.030679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:18.030866 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.030728 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aece6290-7c19-48b1-8de7-9bf0f3a382c9-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:13:18.030866 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.030739 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:13:18.030866 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.030754 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7n6c\" (UniqueName: \"kubernetes.io/projected/aece6290-7c19-48b1-8de7-9bf0f3a382c9-kube-api-access-v7n6c\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:13:18.030866 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.030768 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aece6290-7c19-48b1-8de7-9bf0f3a382c9-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:13:18.030866 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:18.030832 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 23 18:13:18.031044 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:18.030894 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls podName:039d4720-5427-47aa-9d46-2deec9604dd6 nodeName:}" failed. No retries permitted until 2026-04-23 18:13:19.030879037 +0000 UTC m=+1256.950971489 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" (UID: "039d4720-5427-47aa-9d46-2deec9604dd6") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found
Apr 23 18:13:18.117306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117277 2568 generic.go:358] "Generic (PLEG): container finished" podID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerID="1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2" exitCode=2
Apr 23 18:13:18.117306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117301 2568 generic.go:358] "Generic (PLEG): container finished" podID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerID="e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374" exitCode=0
Apr 23 18:13:18.117607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117361 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"
Apr 23 18:13:18.117607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117367 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerDied","Data":"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"}
Apr 23 18:13:18.117607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117411 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerDied","Data":"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"}
Apr 23 18:13:18.117607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117427 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn" event={"ID":"aece6290-7c19-48b1-8de7-9bf0f3a382c9","Type":"ContainerDied","Data":"bb553fc3c0dae6f1336ddcc0cad65544664cf265e900a2c7e0ae6a28ac001fd6"}
Apr 23 18:13:18.117607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.117444 2568 scope.go:117] "RemoveContainer" containerID="1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"
Apr 23 18:13:18.125416 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.125403 2568 scope.go:117] "RemoveContainer" containerID="e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"
Apr 23 18:13:18.132385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.132344 2568 scope.go:117] "RemoveContainer" containerID="7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"
Apr 23 18:13:18.139274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.139170 2568 scope.go:117] "RemoveContainer" containerID="1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"
Apr 23 18:13:18.139493 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:18.139460 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2\": container with ID starting with 1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2 not found: ID does not exist" containerID="1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"
Apr 23 18:13:18.139584 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.139507 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"} err="failed to get container status \"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2\": rpc error: code = NotFound desc = could not find container \"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2\": container with ID starting with 1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2 not found: ID does not exist"
Apr 23 18:13:18.139654 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.139586 2568 scope.go:117] "RemoveContainer" containerID="e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"
Apr 23 18:13:18.139852 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:18.139834 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374\": container with ID starting with e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374 not found: ID does not exist" containerID="e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"
Apr 23 18:13:18.139967 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.139859 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"} err="failed to get container status \"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374\": rpc error: code = NotFound desc = could not find container \"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374\": container with ID starting with e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374 not found: ID does not exist"
Apr 23 18:13:18.139967 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.139879 2568 scope.go:117] "RemoveContainer" containerID="7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"
Apr 23 18:13:18.140139 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:13:18.140121 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78\": container with ID starting with 7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78 not found: ID does not exist" containerID="7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"
Apr 23 18:13:18.140206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140133 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"]
Apr 23 18:13:18.140206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140146 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"} err="failed to get container status \"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78\": rpc error: code = NotFound desc = could not find container \"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78\": container with ID starting with 7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78 not found: ID does not exist"
Apr 23 18:13:18.140206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140165 2568 scope.go:117] "RemoveContainer" containerID="1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"
Apr 23 18:13:18.140463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140441 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2"} err="failed to get container status \"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2\": rpc error: code = NotFound desc = could not find container \"1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2\": container with ID starting with 1222efc99ba7b9d380720c1a08e618e064c9e903ea1178cdb8c1a0b45680c6c2 not found: ID does not exist"
Apr 23 18:13:18.140580 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140466 2568 scope.go:117] "RemoveContainer" containerID="e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"
Apr 23 18:13:18.140760 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140731 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374"} err="failed to get container status \"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374\": rpc error: code = NotFound desc = could not find container \"e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374\": container with ID starting with e6645e2e8001a3933504154ed5623da1c4b312df1da62a81ec723b8a7de8b374 not found: ID does not exist"
Apr 23 18:13:18.140819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140764 2568 scope.go:117] "RemoveContainer" containerID="7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"
Apr 23 18:13:18.141019 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.140998 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78"} err="failed to get container status \"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78\": rpc error: code = NotFound desc = could not find container \"7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78\": container with ID starting with 7019ab01da9ab4aa5b64c31283f81fc8dd478c8f40d8a3f2a13724f17f946c78 not found: ID does not exist"
Apr 23 18:13:18.145084 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.145053 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-8qpqn"]
Apr 23 18:13:18.598176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:18.598144 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" path="/var/lib/kubelet/pods/aece6290-7c19-48b1-8de7-9bf0f3a382c9/volumes"
Apr 23 18:13:19.039037 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:19.038923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"
Apr 23 18:13:19.041301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:19.041277 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:19.129302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:19.129267 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:19.254766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:19.254732 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"] Apr 23 18:13:19.257880 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:13:19.257847 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod039d4720_5427_47aa_9d46_2deec9604dd6.slice/crio-b9a6b0eafa3e61db3e9333812ee0c269af55ce0958db29082415008297748856 WatchSource:0}: Error finding container b9a6b0eafa3e61db3e9333812ee0c269af55ce0958db29082415008297748856: Status 404 returned error can't find the container with id b9a6b0eafa3e61db3e9333812ee0c269af55ce0958db29082415008297748856 Apr 23 18:13:20.124463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:20.124425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerStarted","Data":"fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9"} Apr 23 18:13:20.124463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:20.124466 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerStarted","Data":"b9a6b0eafa3e61db3e9333812ee0c269af55ce0958db29082415008297748856"} Apr 23 18:13:24.135310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:24.135274 2568 generic.go:358] "Generic (PLEG): container finished" podID="039d4720-5427-47aa-9d46-2deec9604dd6" containerID="fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9" exitCode=0 Apr 23 18:13:24.135697 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:24.135321 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerDied","Data":"fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9"} Apr 23 18:13:25.139690 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:25.139658 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerStarted","Data":"c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447"} Apr 23 18:13:25.140094 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:25.139697 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerStarted","Data":"561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600"} Apr 23 18:13:25.140094 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:25.139907 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:25.140094 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:25.139946 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:13:25.160292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:25.160247 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" podStartSLOduration=8.160232354 podStartE2EDuration="8.160232354s" podCreationTimestamp="2026-04-23 18:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:13:25.158258513 +0000 UTC m=+1263.078350988" watchObservedRunningTime="2026-04-23 18:13:25.160232354 +0000 UTC m=+1263.080324828" Apr 23 18:13:31.148477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:13:31.148445 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:14:01.152877 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:01.152843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:14:07.417192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.417153 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"] Apr 23 18:14:07.417602 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.417478 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kserve-container" containerID="cri-o://561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600" gracePeriod=30 Apr 23 18:14:07.417602 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.417540 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kube-rbac-proxy" containerID="cri-o://c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447" gracePeriod=30 Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.545616 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"] Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546237 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="storage-initializer" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546257 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="storage-initializer" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546282 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kube-rbac-proxy" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546291 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kube-rbac-proxy" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546322 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" 
containerName="kserve-container" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546331 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kserve-container" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546446 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kube-rbac-proxy" Apr 23 18:14:07.549209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.546458 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="aece6290-7c19-48b1-8de7-9bf0f3a382c9" containerName="kserve-container" Apr 23 18:14:07.550712 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.550686 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:07.552679 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.552656 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 23 18:14:07.552679 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.552671 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 23 18:14:07.563492 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.563463 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"] Apr 23 18:14:07.593072 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.593034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjjd8\" (UniqueName: \"kubernetes.io/projected/4d4b9034-8631-4788-b747-a6e69f470a20-kube-api-access-jjjd8\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:07.593235 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.593122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4b9034-8631-4788-b747-a6e69f470a20-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:07.593235 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.593175 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:07.593235 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.593201 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d4b9034-8631-4788-b747-a6e69f470a20-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:07.694513 
Apr 23 18:14:07.694513 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.694471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4b9034-8631-4788-b747-a6e69f470a20-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:07.694513 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.694501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:07.694807 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.694525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d4b9034-8631-4788-b747-a6e69f470a20-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:07.694807 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:14:07.694613 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found
Apr 23 18:14:07.694807 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:14:07.694737 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls podName:4d4b9034-8631-4788-b747-a6e69f470a20 nodeName:}" failed. No retries permitted until 2026-04-23 18:14:08.194696225 +0000 UTC m=+1306.114788682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls") pod "isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" (UID: "4d4b9034-8631-4788-b747-a6e69f470a20") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found
Apr 23 18:14:07.694963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.694917 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4b9034-8631-4788-b747-a6e69f470a20-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:07.695189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.695173 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d4b9034-8631-4788-b747-a6e69f470a20-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:07.706858 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:07.706832 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjjd8\" (UniqueName: \"kubernetes.io/projected/4d4b9034-8631-4788-b747-a6e69f470a20-kube-api-access-jjjd8\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:08.199304 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.199266 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:08.201842 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.201812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-598ddf44db-9wnsn\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:08.264728 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.264692 2568 generic.go:358] "Generic (PLEG): container finished" podID="039d4720-5427-47aa-9d46-2deec9604dd6" containerID="c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447" exitCode=2
Apr 23 18:14:08.264906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.264767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerDied","Data":"c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447"}
Apr 23 18:14:08.460905 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.460802 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:14:08.604925 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.604901 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"]
Apr 23 18:14:08.607080 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:14:08.607048 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4b9034_8631_4788_b747_a6e69f470a20.slice/crio-9d62d7c1d3dce4f5c3209cf70c0768e9b57f3b83af272bbb92cfc0b926b824ef WatchSource:0}: Error finding container 9d62d7c1d3dce4f5c3209cf70c0768e9b57f3b83af272bbb92cfc0b926b824ef: Status 404 returned error can't find the container with id 9d62d7c1d3dce4f5c3209cf70c0768e9b57f3b83af272bbb92cfc0b926b824ef
Apr 23 18:14:08.679093 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.679065 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"
Apr 23 18:14:08.703848 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.703818 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") pod \"039d4720-5427-47aa-9d46-2deec9604dd6\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") "
Apr 23 18:14:08.704056 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.703885 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56kr5\" (UniqueName: \"kubernetes.io/projected/039d4720-5427-47aa-9d46-2deec9604dd6-kube-api-access-56kr5\") pod \"039d4720-5427-47aa-9d46-2deec9604dd6\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") "
Apr 23 18:14:08.704056 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.703997 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/039d4720-5427-47aa-9d46-2deec9604dd6-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"039d4720-5427-47aa-9d46-2deec9604dd6\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") "
Apr 23 18:14:08.704056 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.704050 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/039d4720-5427-47aa-9d46-2deec9604dd6-kserve-provision-location\") pod \"039d4720-5427-47aa-9d46-2deec9604dd6\" (UID: \"039d4720-5427-47aa-9d46-2deec9604dd6\") "
Apr 23 18:14:08.704591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.704443 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039d4720-5427-47aa-9d46-2deec9604dd6-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "039d4720-5427-47aa-9d46-2deec9604dd6" (UID: "039d4720-5427-47aa-9d46-2deec9604dd6"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:14:08.704591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.704529 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/039d4720-5427-47aa-9d46-2deec9604dd6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "039d4720-5427-47aa-9d46-2deec9604dd6" (UID: "039d4720-5427-47aa-9d46-2deec9604dd6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:14:08.706592 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.706568 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "039d4720-5427-47aa-9d46-2deec9604dd6" (UID: "039d4720-5427-47aa-9d46-2deec9604dd6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:14:08.706681 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.706611 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039d4720-5427-47aa-9d46-2deec9604dd6-kube-api-access-56kr5" (OuterVolumeSpecName: "kube-api-access-56kr5") pod "039d4720-5427-47aa-9d46-2deec9604dd6" (UID: "039d4720-5427-47aa-9d46-2deec9604dd6"). InnerVolumeSpecName "kube-api-access-56kr5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:14:08.805245 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.805154 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-56kr5\" (UniqueName: \"kubernetes.io/projected/039d4720-5427-47aa-9d46-2deec9604dd6-kube-api-access-56kr5\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:14:08.805245 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.805187 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/039d4720-5427-47aa-9d46-2deec9604dd6-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:14:08.805245 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.805198 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/039d4720-5427-47aa-9d46-2deec9604dd6-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:14:08.805245 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:08.805208 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/039d4720-5427-47aa-9d46-2deec9604dd6-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:14:09.269071 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.269038 2568 generic.go:358] "Generic (PLEG): container finished" podID="039d4720-5427-47aa-9d46-2deec9604dd6" containerID="561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600" exitCode=0 Apr 23 18:14:09.269249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.269117 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" Apr 23 18:14:09.269249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.269127 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerDied","Data":"561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600"} Apr 23 18:14:09.269249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.269163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd" event={"ID":"039d4720-5427-47aa-9d46-2deec9604dd6","Type":"ContainerDied","Data":"b9a6b0eafa3e61db3e9333812ee0c269af55ce0958db29082415008297748856"} Apr 23 18:14:09.269249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.269179 2568 scope.go:117] "RemoveContainer" containerID="c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447" Apr 23 18:14:09.270468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.270435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerStarted","Data":"b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe"} Apr 23 18:14:09.270468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.270467 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerStarted","Data":"9d62d7c1d3dce4f5c3209cf70c0768e9b57f3b83af272bbb92cfc0b926b824ef"} Apr 23 18:14:09.276751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.276735 2568 scope.go:117] "RemoveContainer" containerID="561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600" Apr 23 18:14:09.283458 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.283443 2568 scope.go:117] "RemoveContainer" containerID="fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9" Apr 23 18:14:09.289560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.289544 2568 scope.go:117] "RemoveContainer" containerID="c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447" Apr 23 18:14:09.289787 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:14:09.289767 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447\": container with ID starting with c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447 not found: ID does not exist" containerID="c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447" Apr 23 18:14:09.289848 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.289796 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447"} err="failed to get container status \"c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447\": rpc error: code = NotFound desc = could not find container \"c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447\": container with ID starting with c03c8cef267a8d6dcd5747be1fc70fccdf4ec5a80831db1ec7a87640b8b8e447 not found: ID does not exist" Apr 23 18:14:09.289848 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.289813 2568 scope.go:117] "RemoveContainer" 
containerID="561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600" Apr 23 18:14:09.290110 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:14:09.290091 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600\": container with ID starting with 561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600 not found: ID does not exist" containerID="561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600" Apr 23 18:14:09.290175 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.290115 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600"} err="failed to get container status \"561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600\": rpc error: code = NotFound desc = could not find container \"561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600\": container with ID starting with 561f5be0601d2e4d1f06941da80909e949f2701c87bc35323751b0fd81aa2600 not found: ID does not exist" Apr 23 18:14:09.290175 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.290131 2568 scope.go:117] "RemoveContainer" containerID="fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9" Apr 23 18:14:09.290336 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:14:09.290319 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9\": container with ID starting with fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9 not found: ID does not exist" containerID="fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9" Apr 23 18:14:09.290385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.290343 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9"} err="failed to get container status \"fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9\": rpc error: code = NotFound desc = could not find container \"fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9\": container with ID starting with fd51c103e48b74b2431bd2c3b360e8932e191c331582094a7ddd0a4d175e84e9 not found: ID does not exist" Apr 23 18:14:09.311412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.311390 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"] Apr 23 18:14:09.316444 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:09.316426 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-zclnd"] Apr 23 18:14:10.600505 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:10.600466 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" path="/var/lib/kubelet/pods/039d4720-5427-47aa-9d46-2deec9604dd6/volumes" Apr 23 18:14:13.283259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:13.283214 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d4b9034-8631-4788-b747-a6e69f470a20" containerID="b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe" exitCode=0 Apr 23 18:14:13.283259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:13.283262 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerDied","Data":"b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe"} Apr 23 18:14:14.292899 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:14.292859 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerStarted","Data":"9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33"} Apr 23 18:14:16.300465 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:16.300425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerStarted","Data":"e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e"} Apr 23 18:14:16.300465 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:16.300466 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerStarted","Data":"f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66"} Apr 23 18:14:16.300881 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:16.300738 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:16.300881 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:16.300769 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:16.326687 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:16.326641 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podStartSLOduration=7.434179115 podStartE2EDuration="9.326627378s" podCreationTimestamp="2026-04-23 18:14:07 +0000 UTC" firstStartedPulling="2026-04-23 18:14:13.357611978 +0000 UTC m=+1311.277704440" lastFinishedPulling="2026-04-23 18:14:15.25006025 +0000 UTC m=+1313.170152703" observedRunningTime="2026-04-23 18:14:16.325126216 +0000 UTC m=+1314.245218691" watchObservedRunningTime="2026-04-23 18:14:16.326627378 +0000 UTC m=+1314.246719853" Apr 23 18:14:17.303594 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:17.303567 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:23.312526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:23.312493 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:14:43.313347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:43.313269 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 18:14:53.315177 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:14:53.315137 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" 
Apr 23 18:15:23.316587 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:23.316555 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:15:27.687488 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.687458 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"] Apr 23 18:15:27.688324 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.687753 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container" containerID="cri-o://9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33" gracePeriod=30 Apr 23 18:15:27.688324 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.687832 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-agent" containerID="cri-o://f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66" gracePeriod=30 Apr 23 18:15:27.688324 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.687836 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" containerID="cri-o://e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e" gracePeriod=30 Apr 23 18:15:27.732336 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732299 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"] Apr 23 18:15:27.732595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732583 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="storage-initializer" Apr 23 18:15:27.732639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732597 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="storage-initializer" Apr 23 18:15:27.732639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732616 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kserve-container" Apr 23 18:15:27.732639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732623 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kserve-container" Apr 23 18:15:27.732639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732630 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kube-rbac-proxy" Apr 23 18:15:27.732639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732635 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kube-rbac-proxy" Apr 23 18:15:27.732794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732688 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kserve-container" Apr 23 18:15:27.732794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.732701 2568 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="039d4720-5427-47aa-9d46-2deec9604dd6" containerName="kube-rbac-proxy" Apr 23 18:15:27.735828 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.735810 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.737676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.737655 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 23 18:15:27.737864 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.737845 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 23 18:15:27.753815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.753785 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"] Apr 23 18:15:27.797418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.797386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1b481eb-27dd-430b-acdb-94bd93f345c3-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.797600 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.797445 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxxg\" (UniqueName: \"kubernetes.io/projected/b1b481eb-27dd-430b-acdb-94bd93f345c3-kube-api-access-tqxxg\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.797600 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.797501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b481eb-27dd-430b-acdb-94bd93f345c3-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.797600 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.797570 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1b481eb-27dd-430b-acdb-94bd93f345c3-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.897951 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.897901 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxxg\" (UniqueName: \"kubernetes.io/projected/b1b481eb-27dd-430b-acdb-94bd93f345c3-kube-api-access-tqxxg\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.898147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.898012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b1b481eb-27dd-430b-acdb-94bd93f345c3-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.898147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.898044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1b481eb-27dd-430b-acdb-94bd93f345c3-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.898147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.898080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1b481eb-27dd-430b-acdb-94bd93f345c3-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.898570 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.898541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1b481eb-27dd-430b-acdb-94bd93f345c3-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.898779 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.898761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1b481eb-27dd-430b-acdb-94bd93f345c3-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.900426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.900404 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b481eb-27dd-430b-acdb-94bd93f345c3-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:27.907650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:27.907627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxxg\" (UniqueName: \"kubernetes.io/projected/b1b481eb-27dd-430b-acdb-94bd93f345c3-kube-api-access-tqxxg\") pod \"isvc-paddle-predictor-6b8b7cfb4b-tvfhs\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:28.046642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.046525 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:28.172321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.172283 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"] Apr 23 18:15:28.176190 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:15:28.176159 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1b481eb_27dd_430b_acdb_94bd93f345c3.slice/crio-065a5961e8c9203e57ecedff8eaeb2b2ffcce07f98e58b135211de1995da9248 WatchSource:0}: Error finding container 065a5961e8c9203e57ecedff8eaeb2b2ffcce07f98e58b135211de1995da9248: Status 404 returned error can't find the container with id 065a5961e8c9203e57ecedff8eaeb2b2ffcce07f98e58b135211de1995da9248 Apr 23 18:15:28.307637 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.307523 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 18:15:28.518367 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.518325 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerStarted","Data":"7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28"} Apr 23 18:15:28.518569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.518373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerStarted","Data":"065a5961e8c9203e57ecedff8eaeb2b2ffcce07f98e58b135211de1995da9248"} Apr 23 18:15:28.520376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.520345 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d4b9034-8631-4788-b747-a6e69f470a20" containerID="e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e" exitCode=2 Apr 23 18:15:28.520500 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:28.520423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerDied","Data":"e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e"} Apr 23 18:15:30.528171 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:30.528132 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d4b9034-8631-4788-b747-a6e69f470a20" containerID="9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33" exitCode=0 Apr 23 18:15:30.528565 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:30.528204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerDied","Data":"9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33"} Apr 23 18:15:33.307995 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:33.307921 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 18:15:33.313494 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:33.313460 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 18:15:33.536776 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:33.536744 2568 generic.go:358] "Generic (PLEG): container finished" podID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerID="7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28" exitCode=0 Apr 23 18:15:33.536965 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:33.536818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerDied","Data":"7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28"} Apr 23 18:15:38.307483 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:38.307439 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 18:15:38.308000 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:38.307599 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:15:43.307642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:43.307594 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 18:15:43.313685 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:43.313645 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 18:15:44.578260 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:44.578224 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerStarted","Data":"61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433"} Apr 23 18:15:44.578662 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:44.578270 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerStarted","Data":"169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a"} Apr 23 18:15:44.578662 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:44.578490 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:44.599798 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:44.599743 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podStartSLOduration=6.8275484859999995 podStartE2EDuration="17.599729879s" podCreationTimestamp="2026-04-23 18:15:27 +0000 UTC" firstStartedPulling="2026-04-23 18:15:33.538087572 +0000 UTC m=+1391.458180026" lastFinishedPulling="2026-04-23 18:15:44.31026895 +0000 UTC m=+1402.230361419" observedRunningTime="2026-04-23 18:15:44.597125575 +0000 UTC m=+1402.517218061" watchObservedRunningTime="2026-04-23 18:15:44.599729879 +0000 UTC m=+1402.519822353" Apr 23 18:15:45.581125 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:45.581090 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:45.582166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:45.582140 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:15:46.584111 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:46.584070 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:15:48.307399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:48.307349 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 18:15:51.588134 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:51.588110 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:15:51.588631 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:51.588598 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:15:53.307503 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:53.307461 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 18:15:53.313770 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:53.313742 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 
Apr 23 18:15:57.831600 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.831576 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Apr 23 18:15:57.924616 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.924586 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4b9034-8631-4788-b747-a6e69f470a20-kserve-provision-location\") pod \"4d4b9034-8631-4788-b747-a6e69f470a20\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") "
Apr 23 18:15:57.924616 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.924624 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d4b9034-8631-4788-b747-a6e69f470a20-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"4d4b9034-8631-4788-b747-a6e69f470a20\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") "
Apr 23 18:15:57.924830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.924645 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls\") pod \"4d4b9034-8631-4788-b747-a6e69f470a20\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") "
Apr 23 18:15:57.924830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.924680 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjjd8\" (UniqueName: \"kubernetes.io/projected/4d4b9034-8631-4788-b747-a6e69f470a20-kube-api-access-jjjd8\") pod \"4d4b9034-8631-4788-b747-a6e69f470a20\" (UID: \"4d4b9034-8631-4788-b747-a6e69f470a20\") "
Apr 23 18:15:57.925007 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.924982 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4b9034-8631-4788-b747-a6e69f470a20-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "4d4b9034-8631-4788-b747-a6e69f470a20" (UID: "4d4b9034-8631-4788-b747-a6e69f470a20"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:15:57.925085 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.924983 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d4b9034-8631-4788-b747-a6e69f470a20-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d4b9034-8631-4788-b747-a6e69f470a20" (UID: "4d4b9034-8631-4788-b747-a6e69f470a20"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:15:57.926768 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.926736 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4b9034-8631-4788-b747-a6e69f470a20-kube-api-access-jjjd8" (OuterVolumeSpecName: "kube-api-access-jjjd8") pod "4d4b9034-8631-4788-b747-a6e69f470a20" (UID: "4d4b9034-8631-4788-b747-a6e69f470a20"). InnerVolumeSpecName "kube-api-access-jjjd8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:15:57.926884 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:57.926854 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d4b9034-8631-4788-b747-a6e69f470a20" (UID: "4d4b9034-8631-4788-b747-a6e69f470a20"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:15:58.026197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.026104 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jjjd8\" (UniqueName: \"kubernetes.io/projected/4d4b9034-8631-4788-b747-a6e69f470a20-kube-api-access-jjjd8\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:15:58.026197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.026135 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4b9034-8631-4788-b747-a6e69f470a20-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:15:58.026197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.026146 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d4b9034-8631-4788-b747-a6e69f470a20-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:15:58.026197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.026157 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d4b9034-8631-4788-b747-a6e69f470a20-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:15:58.615942 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.615895 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d4b9034-8631-4788-b747-a6e69f470a20" containerID="f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66" exitCode=0
Apr 23 18:15:58.616139 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.616016 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" Apr 23 18:15:58.616139 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.616037 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerDied","Data":"f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66"} Apr 23 18:15:58.616139 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.616067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn" event={"ID":"4d4b9034-8631-4788-b747-a6e69f470a20","Type":"ContainerDied","Data":"9d62d7c1d3dce4f5c3209cf70c0768e9b57f3b83af272bbb92cfc0b926b824ef"} Apr 23 18:15:58.616139 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.616086 2568 scope.go:117] "RemoveContainer" containerID="e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e" Apr 23 18:15:58.624072 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.624006 2568 scope.go:117] "RemoveContainer" containerID="f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66" Apr 23 18:15:58.631342 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.631318 2568 scope.go:117] "RemoveContainer" containerID="9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33" Apr 23 18:15:58.637412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.637390 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"] Apr 23 18:15:58.639581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.639560 2568 scope.go:117] "RemoveContainer" containerID="b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe" Apr 23 18:15:58.644560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.644532 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-598ddf44db-9wnsn"] Apr 23 18:15:58.646910 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.646893 2568 scope.go:117] "RemoveContainer" containerID="e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e" Apr 23 18:15:58.647208 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:15:58.647189 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e\": container with ID starting with e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e not found: ID does not exist" containerID="e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e" Apr 23 18:15:58.647276 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.647221 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e"} err="failed to get container status \"e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e\": rpc error: code = NotFound desc = could not find container \"e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e\": container with ID starting with e3ac93a5ab5bf150dd67c9ec1e6c593339ffb39d2d13ea6517c5bdd5def4d68e not found: ID does not exist" Apr 23 18:15:58.647276 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.647249 2568 scope.go:117] "RemoveContainer" containerID="f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66" Apr 23 18:15:58.647528 ip-10-0-131-144 
kubenswrapper[2568]: E0423 18:15:58.647510 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66\": container with ID starting with f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66 not found: ID does not exist" containerID="f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66" Apr 23 18:15:58.647569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.647534 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66"} err="failed to get container status \"f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66\": rpc error: code = NotFound desc = could not find container \"f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66\": container with ID starting with f57f5bf01851ddc79d80369ea4d6e8c4672db7a56e884d5bae975e311291bc66 not found: ID does not exist" Apr 23 18:15:58.647569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.647550 2568 scope.go:117] "RemoveContainer" containerID="9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33" Apr 23 18:15:58.647773 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:15:58.647758 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33\": container with ID starting with 9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33 not found: ID does not exist" containerID="9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33" Apr 23 18:15:58.647821 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.647779 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33"} err="failed to get container status \"9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33\": rpc error: code = NotFound desc = could not find container \"9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33\": container with ID starting with 9d3ad051ecd15d5f14942c92a45b9b8d122bc3ea9fee72965b9c4d20803a6f33 not found: ID does not exist" Apr 23 18:15:58.647821 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.647797 2568 scope.go:117] "RemoveContainer" containerID="b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe" Apr 23 18:15:58.648012 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:15:58.647995 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe\": container with ID starting with b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe not found: ID does not exist" containerID="b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe" Apr 23 18:15:58.648080 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:15:58.648019 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe"} err="failed to get container status \"b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe\": rpc error: code = NotFound desc = could not find container \"b0526b521963d2cb6ca57806c85a3d5057698a204ee6f4170c734fc4cf391cbe\": container with ID starting with 
Apr 23 18:16:00.602188 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:00.602154 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" path="/var/lib/kubelet/pods/4d4b9034-8631-4788-b747-a6e69f470a20/volumes"
Apr 23 18:16:01.588864 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:01.588816 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 23 18:16:11.589563 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:11.589474 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 23 18:16:21.589295 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:21.589256 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 23 18:16:31.589086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:31.589057 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"
Apr 23 18:16:39.167573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.167538 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"]
Apr 23 18:16:39.168078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.167855 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" containerID="cri-o://169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a" gracePeriod=30
Apr 23 18:16:39.168078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.167886 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kube-rbac-proxy" containerID="cri-o://61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433" gracePeriod=30
Apr 23 18:16:39.270256 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270220 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"]
Apr 23 18:16:39.270515 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270502 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270517 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270528 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="storage-initializer"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270534 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="storage-initializer"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270549 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270555 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270561 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-agent"
Apr 23 18:16:39.270566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270568 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-agent"
Apr 23 18:16:39.270806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270611 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kube-rbac-proxy"
Apr 23 18:16:39.270806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270625 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-agent"
Apr 23 18:16:39.270806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.270635 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d4b9034-8631-4788-b747-a6e69f470a20" containerName="kserve-container"
Apr 23 18:16:39.290357 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.290193 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"]
Apr 23 18:16:39.290441 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.290363 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.292352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.292326 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 23 18:16:39.292489 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.292410 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:16:39.434218 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.434137 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78830a59-20c3-498c-a1d8-739301cd31c1-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.434218 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.434213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78830a59-20c3-498c-a1d8-739301cd31c1-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.434427 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.434257 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2r2j\" (UniqueName: \"kubernetes.io/projected/78830a59-20c3-498c-a1d8-739301cd31c1-kube-api-access-t2r2j\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.434427 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.434292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78830a59-20c3-498c-a1d8-739301cd31c1-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.535384 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.535345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78830a59-20c3-498c-a1d8-739301cd31c1-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.535592 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.535404 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78830a59-20c3-498c-a1d8-739301cd31c1-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.535592 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:16:39.535443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2r2j\" (UniqueName: \"kubernetes.io/projected/78830a59-20c3-498c-a1d8-739301cd31c1-kube-api-access-t2r2j\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.535592 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.535469 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78830a59-20c3-498c-a1d8-739301cd31c1-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.535893 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.535875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78830a59-20c3-498c-a1d8-739301cd31c1-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.536163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.536142 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78830a59-20c3-498c-a1d8-739301cd31c1-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.537868 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.537851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78830a59-20c3-498c-a1d8-739301cd31c1-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.545183 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.545159 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2r2j\" (UniqueName: \"kubernetes.io/projected/78830a59-20c3-498c-a1d8-739301cd31c1-kube-api-access-t2r2j\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.602500 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.602460 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:39.724124 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.724097 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"] Apr 23 18:16:39.726705 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:16:39.726673 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78830a59_20c3_498c_a1d8_739301cd31c1.slice/crio-0c84132ce6739d836b80406e5182a95a46c64be11b718828c687a1991cea16ae WatchSource:0}: Error finding container 0c84132ce6739d836b80406e5182a95a46c64be11b718828c687a1991cea16ae: Status 404 returned error can't find the container with id 0c84132ce6739d836b80406e5182a95a46c64be11b718828c687a1991cea16ae Apr 23 18:16:39.732206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.732182 2568 generic.go:358] "Generic (PLEG): container finished" podID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerID="61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433" exitCode=2 Apr 23 18:16:39.732336 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.732246 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerDied","Data":"61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433"} Apr 23 18:16:39.733389 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:39.733356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerStarted","Data":"0c84132ce6739d836b80406e5182a95a46c64be11b718828c687a1991cea16ae"} Apr 23 18:16:40.738128 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:40.738085 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerStarted","Data":"dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c"} Apr 23 18:16:41.584819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:41.584776 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:16:41.589125 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:41.589095 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:16:42.113763 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.113738 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:16:42.258217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.258122 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b481eb-27dd-430b-acdb-94bd93f345c3-proxy-tls\") pod \"b1b481eb-27dd-430b-acdb-94bd93f345c3\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " Apr 23 18:16:42.258217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.258168 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1b481eb-27dd-430b-acdb-94bd93f345c3-kserve-provision-location\") pod \"b1b481eb-27dd-430b-acdb-94bd93f345c3\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " Apr 23 18:16:42.258217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.258199 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1b481eb-27dd-430b-acdb-94bd93f345c3-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"b1b481eb-27dd-430b-acdb-94bd93f345c3\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " Apr 23 18:16:42.258524 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.258283 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqxxg\" (UniqueName: \"kubernetes.io/projected/b1b481eb-27dd-430b-acdb-94bd93f345c3-kube-api-access-tqxxg\") pod \"b1b481eb-27dd-430b-acdb-94bd93f345c3\" (UID: \"b1b481eb-27dd-430b-acdb-94bd93f345c3\") " Apr 23 18:16:42.258623 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.258599 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b481eb-27dd-430b-acdb-94bd93f345c3-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "b1b481eb-27dd-430b-acdb-94bd93f345c3" (UID: "b1b481eb-27dd-430b-acdb-94bd93f345c3"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:16:42.260344 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.260312 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b481eb-27dd-430b-acdb-94bd93f345c3-kube-api-access-tqxxg" (OuterVolumeSpecName: "kube-api-access-tqxxg") pod "b1b481eb-27dd-430b-acdb-94bd93f345c3" (UID: "b1b481eb-27dd-430b-acdb-94bd93f345c3"). InnerVolumeSpecName "kube-api-access-tqxxg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:16:42.260453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.260347 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b481eb-27dd-430b-acdb-94bd93f345c3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b1b481eb-27dd-430b-acdb-94bd93f345c3" (UID: "b1b481eb-27dd-430b-acdb-94bd93f345c3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:16:42.268007 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.267975 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b481eb-27dd-430b-acdb-94bd93f345c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b1b481eb-27dd-430b-acdb-94bd93f345c3" (UID: "b1b481eb-27dd-430b-acdb-94bd93f345c3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:42.359539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.359507 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tqxxg\" (UniqueName: \"kubernetes.io/projected/b1b481eb-27dd-430b-acdb-94bd93f345c3-kube-api-access-tqxxg\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:16:42.359539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.359536 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b481eb-27dd-430b-acdb-94bd93f345c3-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:16:42.359539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.359546 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1b481eb-27dd-430b-acdb-94bd93f345c3-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:16:42.359779 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.359557 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1b481eb-27dd-430b-acdb-94bd93f345c3-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:16:42.747771 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.747737 2568 generic.go:358] "Generic (PLEG): container finished" podID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerID="169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a" exitCode=0 Apr 23 18:16:42.747974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.747817 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" Apr 23 18:16:42.747974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.747825 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerDied","Data":"169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a"} Apr 23 18:16:42.747974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.747865 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs" event={"ID":"b1b481eb-27dd-430b-acdb-94bd93f345c3","Type":"ContainerDied","Data":"065a5961e8c9203e57ecedff8eaeb2b2ffcce07f98e58b135211de1995da9248"} Apr 23 18:16:42.747974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.747881 2568 scope.go:117] "RemoveContainer" containerID="61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433" Apr 23 18:16:42.755883 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.755861 2568 scope.go:117] "RemoveContainer" containerID="169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a" Apr 23 18:16:42.763194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.763171 2568 scope.go:117] "RemoveContainer" containerID="7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28" Apr 23 18:16:42.768654 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.768630 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"] Apr 23 18:16:42.770891 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.770869 2568 scope.go:117] "RemoveContainer" containerID="61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433" Apr 23 18:16:42.771204 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:16:42.771184 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433\": container with ID starting with 61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433 not found: ID does not exist" containerID="61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433" Apr 23 18:16:42.771257 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.771220 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433"} err="failed to get container status \"61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433\": rpc error: code = NotFound desc = could not find container \"61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433\": container with ID starting with 61fe6113d1210ddb9cba29d76cf71d91831b83a9d6d0f4c84c7c6e404c011433 not found: ID does not exist" Apr 23 18:16:42.771257 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.771240 2568 scope.go:117] "RemoveContainer" containerID="169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a" Apr 23 18:16:42.771517 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:16:42.771499 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a\": container with ID starting with 169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a not found: ID does not exist" 
containerID="169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a" Apr 23 18:16:42.771563 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.771523 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a"} err="failed to get container status \"169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a\": rpc error: code = NotFound desc = could not find container \"169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a\": container with ID starting with 169413a59bf5c335e9f1f875ddd431aa9304a405033051cea7926f2648df931a not found: ID does not exist" Apr 23 18:16:42.771563 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.771539 2568 scope.go:117] "RemoveContainer" containerID="7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28" Apr 23 18:16:42.771769 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:16:42.771752 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28\": container with ID starting with 7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28 not found: ID does not exist" containerID="7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28" Apr 23 18:16:42.771812 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.771774 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28"} err="failed to get container status \"7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28\": rpc error: code = NotFound desc = could not find container \"7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28\": container with ID starting with 7d1cfb8f860146fe9bc7381a88b824ea4f9995dba1ec4b9f726d759ecf977f28 not found: ID does not exist" Apr 23 18:16:42.777895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:42.777864 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-tvfhs"] Apr 23 18:16:44.599306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:44.599277 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" path="/var/lib/kubelet/pods/b1b481eb-27dd-430b-acdb-94bd93f345c3/volumes" Apr 23 18:16:44.756219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:44.756188 2568 generic.go:358] "Generic (PLEG): container finished" podID="78830a59-20c3-498c-a1d8-739301cd31c1" containerID="dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c" exitCode=0 Apr 23 18:16:44.756385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:44.756243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerDied","Data":"dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c"} Apr 23 18:16:45.760816 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:45.760778 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerStarted","Data":"0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a"} Apr 23 18:16:45.760816 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:45.760822 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerStarted","Data":"05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23"} Apr 23 18:16:45.761292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:45.761134 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:45.761292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:45.761245 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:45.762739 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:45.762713 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:16:45.787072 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:45.787030 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podStartSLOduration=6.787016091 podStartE2EDuration="6.787016091s" podCreationTimestamp="2026-04-23 18:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:16:45.785070706 +0000 UTC m=+1463.705163181" watchObservedRunningTime="2026-04-23 18:16:45.787016091 +0000 UTC m=+1463.707108566" Apr 23 18:16:46.763744 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:46.763706 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:16:51.768110 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:51.768075 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:16:51.768620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:16:51.768566 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:17:01.769260 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:01.769222 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:17:11.768544 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:11.768502 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:17:21.768576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:21.768537 2568 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:17:22.608533 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:22.608506 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:17:22.612297 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:22.612275 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:17:22.614342 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:22.614322 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:17:22.615725 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:22.615705 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:17:31.769067 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:31.769042 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:17:40.727706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.727667 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"] Apr 23 18:17:40.728301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.728006 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" containerID="cri-o://05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23" gracePeriod=30 Apr 23 18:17:40.728301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.728063 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kube-rbac-proxy" containerID="cri-o://0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a" gracePeriod=30 Apr 23 18:17:40.888963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.888910 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc"] Apr 23 18:17:40.889268 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889251 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" Apr 23 18:17:40.889350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889270 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" Apr 23 18:17:40.889350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889296 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kube-rbac-proxy" Apr 23 18:17:40.889350 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:17:40.889305 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kube-rbac-proxy" Apr 23 18:17:40.889350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889320 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="storage-initializer" Apr 23 18:17:40.889350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889329 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="storage-initializer" Apr 23 18:17:40.889607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889403 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kserve-container" Apr 23 18:17:40.889607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.889414 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1b481eb-27dd-430b-acdb-94bd93f345c3" containerName="kube-rbac-proxy" Apr 23 18:17:40.892343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.892321 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:40.895499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.895477 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 23 18:17:40.895623 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.895535 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 18:17:40.905342 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.905310 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc"] Apr 23 18:17:40.912016 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.911985 2568 generic.go:358] "Generic (PLEG): container finished" podID="78830a59-20c3-498c-a1d8-739301cd31c1" containerID="0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a" exitCode=2 Apr 23 18:17:40.912188 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:40.912059 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerDied","Data":"0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a"} Apr 23 18:17:41.011157 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.011017 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86964dd8-1e58-47f5-b87a-76d73243b1aa-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.011157 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.011059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86964dd8-1e58-47f5-b87a-76d73243b1aa-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: 
\"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.019012 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.011165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bngkt\" (UniqueName: \"kubernetes.io/projected/86964dd8-1e58-47f5-b87a-76d73243b1aa-kube-api-access-bngkt\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.019012 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.011198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86964dd8-1e58-47f5-b87a-76d73243b1aa-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.112269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.112228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bngkt\" (UniqueName: \"kubernetes.io/projected/86964dd8-1e58-47f5-b87a-76d73243b1aa-kube-api-access-bngkt\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.112269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.112273 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86964dd8-1e58-47f5-b87a-76d73243b1aa-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.112507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.112312 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86964dd8-1e58-47f5-b87a-76d73243b1aa-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.112507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.112333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86964dd8-1e58-47f5-b87a-76d73243b1aa-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.112756 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.112736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86964dd8-1e58-47f5-b87a-76d73243b1aa-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.113101 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.113081 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86964dd8-1e58-47f5-b87a-76d73243b1aa-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.114915 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.114883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86964dd8-1e58-47f5-b87a-76d73243b1aa-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.121452 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.121429 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bngkt\" (UniqueName: \"kubernetes.io/projected/86964dd8-1e58-47f5-b87a-76d73243b1aa-kube-api-access-bngkt\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.202380 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.202334 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:41.326732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.326690 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc"] Apr 23 18:17:41.329672 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:17:41.329644 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86964dd8_1e58_47f5_b87a_76d73243b1aa.slice/crio-16f49c1747cf49d440d0cec6d351fd25508aa13b2ba3e2b3ba02550567595453 WatchSource:0}: Error finding container 16f49c1747cf49d440d0cec6d351fd25508aa13b2ba3e2b3ba02550567595453: Status 404 returned error can't find the container with id 16f49c1747cf49d440d0cec6d351fd25508aa13b2ba3e2b3ba02550567595453 Apr 23 18:17:41.331397 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.331381 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:17:41.764364 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.764323 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 23 18:17:41.768705 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.768677 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:17:41.918815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.918776 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" 
event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerStarted","Data":"6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59"} Apr 23 18:17:41.918815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:41.918818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerStarted","Data":"16f49c1747cf49d440d0cec6d351fd25508aa13b2ba3e2b3ba02550567595453"} Apr 23 18:17:43.468955 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.468911 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:17:43.631551 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.631524 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78830a59-20c3-498c-a1d8-739301cd31c1-proxy-tls\") pod \"78830a59-20c3-498c-a1d8-739301cd31c1\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " Apr 23 18:17:43.631740 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.631570 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78830a59-20c3-498c-a1d8-739301cd31c1-kserve-provision-location\") pod \"78830a59-20c3-498c-a1d8-739301cd31c1\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " Apr 23 18:17:43.631740 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.631601 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78830a59-20c3-498c-a1d8-739301cd31c1-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"78830a59-20c3-498c-a1d8-739301cd31c1\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " Apr 23 18:17:43.631740 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.631624 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2r2j\" (UniqueName: \"kubernetes.io/projected/78830a59-20c3-498c-a1d8-739301cd31c1-kube-api-access-t2r2j\") pod \"78830a59-20c3-498c-a1d8-739301cd31c1\" (UID: \"78830a59-20c3-498c-a1d8-739301cd31c1\") " Apr 23 18:17:43.632012 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.631984 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78830a59-20c3-498c-a1d8-739301cd31c1-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "78830a59-20c3-498c-a1d8-739301cd31c1" (UID: "78830a59-20c3-498c-a1d8-739301cd31c1"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:17:43.633692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.633659 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78830a59-20c3-498c-a1d8-739301cd31c1-kube-api-access-t2r2j" (OuterVolumeSpecName: "kube-api-access-t2r2j") pod "78830a59-20c3-498c-a1d8-739301cd31c1" (UID: "78830a59-20c3-498c-a1d8-739301cd31c1"). InnerVolumeSpecName "kube-api-access-t2r2j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:17:43.633843 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.633761 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78830a59-20c3-498c-a1d8-739301cd31c1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "78830a59-20c3-498c-a1d8-739301cd31c1" (UID: "78830a59-20c3-498c-a1d8-739301cd31c1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:17:43.641554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.641533 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78830a59-20c3-498c-a1d8-739301cd31c1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "78830a59-20c3-498c-a1d8-739301cd31c1" (UID: "78830a59-20c3-498c-a1d8-739301cd31c1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:17:43.732708 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.732683 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78830a59-20c3-498c-a1d8-739301cd31c1-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:17:43.732708 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.732706 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78830a59-20c3-498c-a1d8-739301cd31c1-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:17:43.732846 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.732717 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78830a59-20c3-498c-a1d8-739301cd31c1-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:17:43.732846 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.732728 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2r2j\" (UniqueName: \"kubernetes.io/projected/78830a59-20c3-498c-a1d8-739301cd31c1-kube-api-access-t2r2j\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:17:43.925601 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.925527 2568 generic.go:358] "Generic (PLEG): container finished" podID="78830a59-20c3-498c-a1d8-739301cd31c1" containerID="05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23" exitCode=0 Apr 23 18:17:43.925601 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.925584 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerDied","Data":"05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23"} Apr 23 18:17:43.925792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.925607 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" event={"ID":"78830a59-20c3-498c-a1d8-739301cd31c1","Type":"ContainerDied","Data":"0c84132ce6739d836b80406e5182a95a46c64be11b718828c687a1991cea16ae"} Apr 23 18:17:43.925792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.925609 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg" Apr 23 18:17:43.925792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.925622 2568 scope.go:117] "RemoveContainer" containerID="0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a" Apr 23 18:17:43.933530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.933508 2568 scope.go:117] "RemoveContainer" containerID="05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23" Apr 23 18:17:43.944205 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.944186 2568 scope.go:117] "RemoveContainer" containerID="dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c" Apr 23 18:17:43.950006 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.949984 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"] Apr 23 18:17:43.950912 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.950899 2568 scope.go:117] "RemoveContainer" containerID="0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a" Apr 23 18:17:43.951213 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:17:43.951194 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a\": container with ID starting with 0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a not found: ID does not exist" containerID="0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a" Apr 23 18:17:43.951280 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.951219 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a"} err="failed to get container status \"0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a\": rpc error: code = NotFound desc = could not find container \"0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a\": container with ID starting with 0bb47419274491ab80cf44fdef32b67c55b90afa656a3db380165469b4e7278a not found: ID does not exist" Apr 23 18:17:43.951280 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.951246 2568 scope.go:117] "RemoveContainer" containerID="05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23" Apr 23 18:17:43.951498 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:17:43.951477 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23\": container with ID starting with 05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23 not found: ID does not exist" containerID="05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23" Apr 23 18:17:43.951554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.951504 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23"} err="failed to get container status \"05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23\": rpc error: code = NotFound desc = could not find container \"05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23\": container with ID starting with 05f4d65956191186ff8c9284d5629bd3dc9ec349943f2fbfa4f84c549ff81e23 not found: ID does not exist" Apr 23 18:17:43.951554 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:17:43.951519 2568 scope.go:117] "RemoveContainer" containerID="dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c" Apr 23 18:17:43.951749 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:17:43.951734 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c\": container with ID starting with dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c not found: ID does not exist" containerID="dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c" Apr 23 18:17:43.951791 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.951752 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c"} err="failed to get container status \"dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c\": rpc error: code = NotFound desc = could not find container \"dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c\": container with ID starting with dee57b63ac54d2e2e23c3776058ffcf5c7f679fe72b91ea9bf7d7f5c0a100d8c not found: ID does not exist" Apr 23 18:17:43.954944 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:43.954910 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-pmdfg"] Apr 23 18:17:44.598488 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:44.598456 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" path="/var/lib/kubelet/pods/78830a59-20c3-498c-a1d8-739301cd31c1/volumes" Apr 23 18:17:45.932362 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:45.932334 2568 generic.go:358] "Generic (PLEG): container finished" podID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerID="6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59" exitCode=0 Apr 23 18:17:45.932738 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:45.932401 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerDied","Data":"6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59"} Apr 23 18:17:46.936777 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:46.936742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerStarted","Data":"0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25"} Apr 23 18:17:46.937165 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:46.936784 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerStarted","Data":"aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d"} Apr 23 18:17:46.937165 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:46.937057 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:46.961078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:46.961034 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" 
podStartSLOduration=6.961020266 podStartE2EDuration="6.961020266s" podCreationTimestamp="2026-04-23 18:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:17:46.960472035 +0000 UTC m=+1524.880564516" watchObservedRunningTime="2026-04-23 18:17:46.961020266 +0000 UTC m=+1524.881112739" Apr 23 18:17:47.940226 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:47.940193 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:47.941370 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:47.941341 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:17:48.942777 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:48.942735 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:17:53.947613 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:53.947588 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:17:53.948173 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:17:53.948147 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:18:03.948173 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:03.948132 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:18:13.948116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:13.948076 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:18:23.948518 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:23.948474 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:18:33.949099 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:33.949064 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:18:42.600298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.600267 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc"] 
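The prober.go:120 failures above are the readiness loop at work: the kubelet dials the pod IP and container port (here 10.133.0.28:8080) on the probe period and keeps getting "connect: connection refused" until the model server binds the port, after which kubelet.go:2658 records the SyncLoop (probe) transition to ready. For a TCP-style check the test reduces to a dial with a timeout; the sketch below is an illustration of that pattern (the `tcpReady` helper and the retry loop are assumptions of this example, not kubelet's prober), with the address taken from the entries above.

```go
// Minimal TCP readiness check in the spirit of the "Probe failed ...
// connect: connection refused" entries above; illustration only.
package main

import (
	"fmt"
	"net"
	"time"
)

// tcpReady succeeds once something is accepting connections on addr.
func tcpReady(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err // e.g. "dial tcp 10.133.0.28:8080: connect: connection refused"
	}
	return conn.Close()
}

func main() {
	addr := "10.133.0.28:8080" // pod IP and port from the log above
	for i := 0; i < 5; i++ {   // the kubelet probes on a fixed period (10s here)
		if err := tcpReady(addr, time.Second); err != nil {
			fmt.Println("Probe failed:", err)
			time.Sleep(10 * time.Second)
			continue
		}
		fmt.Println("ready")
		return
	}
	fmt.Println("giving up")
}
```

Note that per-container probes and pod-level readiness are tracked separately, which is why the 18:17:53 entries above can report the pod flipping to ready while the kserve-container probe is still failing.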
Apr 23 18:18:42.600667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.600564 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" containerID="cri-o://aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d" gracePeriod=30 Apr 23 18:18:42.600667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.600604 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kube-rbac-proxy" containerID="cri-o://0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25" gracePeriod=30 Apr 23 18:18:42.678923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.678894 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7"] Apr 23 18:18:42.679307 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679291 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kube-rbac-proxy" Apr 23 18:18:42.679352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679313 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kube-rbac-proxy" Apr 23 18:18:42.679352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679329 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="storage-initializer" Apr 23 18:18:42.679352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679340 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="storage-initializer" Apr 23 18:18:42.679458 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679350 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" Apr 23 18:18:42.679458 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679361 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" Apr 23 18:18:42.679458 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679441 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kube-rbac-proxy" Apr 23 18:18:42.679458 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.679456 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="78830a59-20c3-498c-a1d8-739301cd31c1" containerName="kserve-container" Apr 23 18:18:42.682585 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.682568 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.684487 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.684462 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 23 18:18:42.684587 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.684464 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 23 18:18:42.693667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.693645 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7"] Apr 23 18:18:42.767804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.767769 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.767964 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.767814 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.767964 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.767894 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.767964 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.767957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tst9l\" (UniqueName: \"kubernetes.io/projected/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kube-api-access-tst9l\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.868804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.868713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.868804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.868764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tst9l\" (UniqueName: \"kubernetes.io/projected/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kube-api-access-tst9l\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.869048 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:18:42.868846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.869048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.868879 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.869188 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.869149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.869562 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.869537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.871345 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.871322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.881457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.881432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tst9l\" (UniqueName: \"kubernetes.io/projected/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kube-api-access-tst9l\") pod \"isvc-pmml-predictor-8bb578669-bdcr7\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:42.991673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:42.991645 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:43.095778 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:43.095748 2568 generic.go:358] "Generic (PLEG): container finished" podID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerID="0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25" exitCode=2 Apr 23 18:18:43.095944 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:43.095821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerDied","Data":"0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25"} Apr 23 18:18:43.113368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:43.113344 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7"] Apr 23 18:18:43.116295 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:18:43.116273 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21f9f8d_2c3e_4f56_b3d4_a8687d529064.slice/crio-c01b883b7ad64ebc70d5ba73d57827247219cf7acb018b0a307ac49d77e03931 WatchSource:0}: Error finding container c01b883b7ad64ebc70d5ba73d57827247219cf7acb018b0a307ac49d77e03931: Status 404 returned error can't find the container with id c01b883b7ad64ebc70d5ba73d57827247219cf7acb018b0a307ac49d77e03931 Apr 23 18:18:43.943977 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:43.943914 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 23 18:18:43.948182 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:43.948162 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:18:44.100780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:44.100752 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerStarted","Data":"de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578"} Apr 23 18:18:44.100780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:44.100785 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerStarted","Data":"c01b883b7ad64ebc70d5ba73d57827247219cf7acb018b0a307ac49d77e03931"} Apr 23 18:18:45.239996 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.239973 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:18:45.287645 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.287618 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86964dd8-1e58-47f5-b87a-76d73243b1aa-kserve-provision-location\") pod \"86964dd8-1e58-47f5-b87a-76d73243b1aa\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " Apr 23 18:18:45.287801 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.287652 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86964dd8-1e58-47f5-b87a-76d73243b1aa-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"86964dd8-1e58-47f5-b87a-76d73243b1aa\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " Apr 23 18:18:45.287801 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.287670 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bngkt\" (UniqueName: \"kubernetes.io/projected/86964dd8-1e58-47f5-b87a-76d73243b1aa-kube-api-access-bngkt\") pod \"86964dd8-1e58-47f5-b87a-76d73243b1aa\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " Apr 23 18:18:45.287940 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.287801 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86964dd8-1e58-47f5-b87a-76d73243b1aa-proxy-tls\") pod \"86964dd8-1e58-47f5-b87a-76d73243b1aa\" (UID: \"86964dd8-1e58-47f5-b87a-76d73243b1aa\") " Apr 23 18:18:45.288028 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.288006 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86964dd8-1e58-47f5-b87a-76d73243b1aa-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "86964dd8-1e58-47f5-b87a-76d73243b1aa" (UID: "86964dd8-1e58-47f5-b87a-76d73243b1aa"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:18:45.289700 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.289665 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86964dd8-1e58-47f5-b87a-76d73243b1aa-kube-api-access-bngkt" (OuterVolumeSpecName: "kube-api-access-bngkt") pod "86964dd8-1e58-47f5-b87a-76d73243b1aa" (UID: "86964dd8-1e58-47f5-b87a-76d73243b1aa"). InnerVolumeSpecName "kube-api-access-bngkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:18:45.289805 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.289718 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86964dd8-1e58-47f5-b87a-76d73243b1aa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "86964dd8-1e58-47f5-b87a-76d73243b1aa" (UID: "86964dd8-1e58-47f5-b87a-76d73243b1aa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:18:45.297657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.297633 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86964dd8-1e58-47f5-b87a-76d73243b1aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86964dd8-1e58-47f5-b87a-76d73243b1aa" (UID: "86964dd8-1e58-47f5-b87a-76d73243b1aa"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:18:45.388650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.388560 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86964dd8-1e58-47f5-b87a-76d73243b1aa-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:18:45.388650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.388591 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86964dd8-1e58-47f5-b87a-76d73243b1aa-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:18:45.388650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.388606 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86964dd8-1e58-47f5-b87a-76d73243b1aa-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:18:45.388650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:45.388622 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bngkt\" (UniqueName: \"kubernetes.io/projected/86964dd8-1e58-47f5-b87a-76d73243b1aa-kube-api-access-bngkt\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:18:46.108360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.108320 2568 generic.go:358] "Generic (PLEG): container finished" podID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerID="aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d" exitCode=0 Apr 23 18:18:46.108567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.108391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerDied","Data":"aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d"} Apr 23 18:18:46.108567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.108410 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" Apr 23 18:18:46.108567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.108421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc" event={"ID":"86964dd8-1e58-47f5-b87a-76d73243b1aa","Type":"ContainerDied","Data":"16f49c1747cf49d440d0cec6d351fd25508aa13b2ba3e2b3ba02550567595453"} Apr 23 18:18:46.108567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.108439 2568 scope.go:117] "RemoveContainer" containerID="0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25" Apr 23 18:18:46.117423 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.117163 2568 scope.go:117] "RemoveContainer" containerID="aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d" Apr 23 18:18:46.124105 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.124082 2568 scope.go:117] "RemoveContainer" containerID="6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59" Apr 23 18:18:46.130737 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.130715 2568 scope.go:117] "RemoveContainer" containerID="0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25" Apr 23 18:18:46.131028 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:18:46.131006 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25\": container with ID starting with 0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25 not found: ID does not exist" containerID="0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25" Apr 23 18:18:46.131110 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.131037 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25"} err="failed to get container status \"0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25\": rpc error: code = NotFound desc = could not find container \"0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25\": container with ID starting with 0c650e2da0e10992282798cd6db651af0220a9e6984b8cdd9b1768d1f3941c25 not found: ID does not exist" Apr 23 18:18:46.131110 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.131054 2568 scope.go:117] "RemoveContainer" containerID="aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d" Apr 23 18:18:46.131311 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:18:46.131292 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d\": container with ID starting with aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d not found: ID does not exist" containerID="aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d" Apr 23 18:18:46.131369 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.131316 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d"} err="failed to get container status \"aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d\": rpc error: code = NotFound desc = could not find container \"aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d\": container with ID starting with 
aac4caab770303bdefc9d96922d7aa81777d5ba4b2e2e26fe0aa43de59c85a3d not found: ID does not exist" Apr 23 18:18:46.131369 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.131332 2568 scope.go:117] "RemoveContainer" containerID="6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59" Apr 23 18:18:46.131573 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:18:46.131529 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59\": container with ID starting with 6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59 not found: ID does not exist" containerID="6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59" Apr 23 18:18:46.131573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.131548 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59"} err="failed to get container status \"6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59\": rpc error: code = NotFound desc = could not find container \"6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59\": container with ID starting with 6a0b1d421c60327a512bc4f7ada35570ea67c1bc358b5b92fe0af66274844e59 not found: ID does not exist" Apr 23 18:18:46.132226 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.132206 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc"] Apr 23 18:18:46.137129 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.137108 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-tw4fc"] Apr 23 18:18:46.598814 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:46.598779 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" path="/var/lib/kubelet/pods/86964dd8-1e58-47f5-b87a-76d73243b1aa/volumes" Apr 23 18:18:47.113219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:47.113186 2568 generic.go:358] "Generic (PLEG): container finished" podID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerID="de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578" exitCode=0 Apr 23 18:18:47.113420 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:47.113263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerDied","Data":"de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578"} Apr 23 18:18:54.138409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:54.138373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerStarted","Data":"dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f"} Apr 23 18:18:54.138409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:54.138415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerStarted","Data":"75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad"} Apr 23 18:18:54.138838 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:54.138619 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:54.161089 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:54.161036 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podStartSLOduration=5.345936865 podStartE2EDuration="12.161019245s" podCreationTimestamp="2026-04-23 18:18:42 +0000 UTC" firstStartedPulling="2026-04-23 18:18:47.114530776 +0000 UTC m=+1585.034623230" lastFinishedPulling="2026-04-23 18:18:53.929613156 +0000 UTC m=+1591.849705610" observedRunningTime="2026-04-23 18:18:54.159833872 +0000 UTC m=+1592.079926349" watchObservedRunningTime="2026-04-23 18:18:54.161019245 +0000 UTC m=+1592.081111721" Apr 23 18:18:55.141424 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:55.141389 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:18:55.142657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:55.142630 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:18:56.144355 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:18:56.144316 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:19:01.148613 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:01.148586 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:19:01.149219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:01.149193 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:19:11.149812 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:11.149727 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:19:21.149189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:21.149149 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:19:31.150094 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:31.150057 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:19:41.149678 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:41.149633 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:19:51.149122 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:19:51.149078 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:20:01.149697 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:01.149654 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:20:11.154252 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:11.150419 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:20:13.639964 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.639911 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7"] Apr 23 18:20:13.640333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.640268 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" containerID="cri-o://75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad" gracePeriod=30 Apr 23 18:20:13.640333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.640286 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kube-rbac-proxy" containerID="cri-o://dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f" gracePeriod=30 Apr 23 18:20:13.753207 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753175 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"] Apr 23 18:20:13.753462 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753450 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" Apr 23 18:20:13.753508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753465 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" Apr 23 18:20:13.753508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753475 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kube-rbac-proxy" Apr 23 18:20:13.753508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753482 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kube-rbac-proxy" Apr 23 18:20:13.753508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753489 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="storage-initializer" Apr 23 18:20:13.753508 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:20:13.753495 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="storage-initializer" Apr 23 18:20:13.753668 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753553 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kube-rbac-proxy" Apr 23 18:20:13.753668 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.753562 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="86964dd8-1e58-47f5-b87a-76d73243b1aa" containerName="kserve-container" Apr 23 18:20:13.756314 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.756288 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:13.758752 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.758696 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 23 18:20:13.759021 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.759006 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:20:13.768540 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.768518 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"] Apr 23 18:20:13.802573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.802545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:13.802674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.802591 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gck\" (UniqueName: \"kubernetes.io/projected/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kube-api-access-z4gck\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:13.802674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.802610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:13.802752 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.802692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:13.903922 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:20:13.903850 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4gck\" (UniqueName: \"kubernetes.io/projected/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kube-api-access-z4gck\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:13.903922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.903889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:13.903922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.903920 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:13.904188 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.903969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:13.904188 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:20:13.904060 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-runtime-predictor-serving-cert: secret "isvc-pmml-runtime-predictor-serving-cert" not found
Apr 23 18:20:13.904188 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:20:13.904115 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls podName:a0f31bbf-d85a-4ac2-885a-7fbda9df085a nodeName:}" failed. No retries permitted until 2026-04-23 18:20:14.404095523 +0000 UTC m=+1672.324187979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls") pod "isvc-pmml-runtime-predictor-67bc544947-bxtjc" (UID: "a0f31bbf-d85a-4ac2-885a-7fbda9df085a") : secret "isvc-pmml-runtime-predictor-serving-cert" not found
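The secret.go:189 and nestedpendingoperations.go:348 pair above shows a missing Secret being treated as a retryable mount failure: the serving certificate is created asynchronously, the first SetUp of proxy-tls loses that race, and the volume manager parks the operation behind a per-volume backoff ("No retries permitted until ... durationBeforeRetry 500ms") rather than spinning; the retry at 18:20:14.409 below succeeds once the secret exists. A sketch of that retry shape follows; `mountWithBackoff` and the stand-in setUp function are assumptions of this illustration, not kubelet's nestedpendingoperations API.

```go
// Illustrative exponential-backoff retry around a volume SetUp that can
// fail while a referenced secret has not been created yet.
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "isvc-pmml-runtime-predictor-serving-cert" not found`)

// mountWithBackoff retries setUp, waiting 500ms before the first retry and
// doubling the wait after each failure, up to maxWait.
func mountWithBackoff(setUp func() error, maxWait time.Duration) error {
	wait := 500 * time.Millisecond
	for {
		err := setUp()
		if err == nil {
			return nil
		}
		if wait > maxWait {
			return fmt.Errorf("giving up: %w", err)
		}
		fmt.Printf("MountVolume.SetUp failed: %v; no retries permitted for %v\n", err, wait)
		time.Sleep(wait)
		wait *= 2
	}
}

func main() {
	attempts := 0
	err := mountWithBackoff(func() error {
		attempts++
		if attempts == 1 {
			return errNotFound // secret not created yet, as at 18:20:13.904
		}
		return nil // secret has appeared, as at 18:20:14.409
	}, 10*time.Second)
	fmt.Println("mounted:", err == nil)
}
```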
Apr 23 18:20:13.904306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.904255 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:13.904574 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.904557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:13.915309 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:13.915284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gck\" (UniqueName: \"kubernetes.io/projected/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kube-api-access-z4gck\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:14.350855 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:14.350825 2568 generic.go:358] "Generic (PLEG): container finished" podID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerID="dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f" exitCode=2
Apr 23 18:20:14.351031 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:14.350887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerDied","Data":"dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f"}
Apr 23 18:20:14.406902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:14.406875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:14.409209 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:14.409192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bxtjc\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"
Apr 23 18:20:14.666414 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:14.666323 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:14.783408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:14.783266 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"] Apr 23 18:20:14.786227 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:20:14.786199 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f31bbf_d85a_4ac2_885a_7fbda9df085a.slice/crio-72605d4f1487418b1ac003c2deeb15b2267598f38b0e3e3564b13b274aaa9a62 WatchSource:0}: Error finding container 72605d4f1487418b1ac003c2deeb15b2267598f38b0e3e3564b13b274aaa9a62: Status 404 returned error can't find the container with id 72605d4f1487418b1ac003c2deeb15b2267598f38b0e3e3564b13b274aaa9a62 Apr 23 18:20:15.360058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:15.360018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerStarted","Data":"5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272"} Apr 23 18:20:15.360058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:15.360061 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerStarted","Data":"72605d4f1487418b1ac003c2deeb15b2267598f38b0e3e3564b13b274aaa9a62"} Apr 23 18:20:16.144628 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:16.144587 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 23 18:20:17.075442 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.075420 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:20:17.130148 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.130072 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " Apr 23 18:20:17.130148 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.130125 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-proxy-tls\") pod \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " Apr 23 18:20:17.130330 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.130164 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kserve-provision-location\") pod \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " Apr 23 18:20:17.130330 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.130208 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tst9l\" (UniqueName: \"kubernetes.io/projected/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kube-api-access-tst9l\") pod \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\" (UID: \"a21f9f8d-2c3e-4f56-b3d4-a8687d529064\") " Apr 23 18:20:17.130506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.130476 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "a21f9f8d-2c3e-4f56-b3d4-a8687d529064" (UID: "a21f9f8d-2c3e-4f56-b3d4-a8687d529064"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:20:17.130612 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.130511 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a21f9f8d-2c3e-4f56-b3d4-a8687d529064" (UID: "a21f9f8d-2c3e-4f56-b3d4-a8687d529064"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:20:17.132183 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.132163 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kube-api-access-tst9l" (OuterVolumeSpecName: "kube-api-access-tst9l") pod "a21f9f8d-2c3e-4f56-b3d4-a8687d529064" (UID: "a21f9f8d-2c3e-4f56-b3d4-a8687d529064"). InnerVolumeSpecName "kube-api-access-tst9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:20:17.132285 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.132267 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a21f9f8d-2c3e-4f56-b3d4-a8687d529064" (UID: "a21f9f8d-2c3e-4f56-b3d4-a8687d529064"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:20:17.231580 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.231547 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tst9l\" (UniqueName: \"kubernetes.io/projected/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kube-api-access-tst9l\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:20:17.231580 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.231573 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:20:17.231580 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.231584 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:20:17.232052 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.231594 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a21f9f8d-2c3e-4f56-b3d4-a8687d529064-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:20:17.368238 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.368200 2568 generic.go:358] "Generic (PLEG): container finished" podID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerID="75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad" exitCode=0 Apr 23 18:20:17.368433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.368277 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" Apr 23 18:20:17.368433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.368275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerDied","Data":"75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad"} Apr 23 18:20:17.368433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.368379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7" event={"ID":"a21f9f8d-2c3e-4f56-b3d4-a8687d529064","Type":"ContainerDied","Data":"c01b883b7ad64ebc70d5ba73d57827247219cf7acb018b0a307ac49d77e03931"} Apr 23 18:20:17.368433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.368396 2568 scope.go:117] "RemoveContainer" containerID="dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f" Apr 23 18:20:17.376684 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.376595 2568 scope.go:117] "RemoveContainer" containerID="75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad" Apr 23 18:20:17.383618 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.383597 2568 scope.go:117] "RemoveContainer" containerID="de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578" Apr 23 18:20:17.391236 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.391213 2568 scope.go:117] "RemoveContainer" containerID="dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f" Apr 23 18:20:17.391515 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.391486 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7"] Apr 23 18:20:17.391586 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:20:17.391509 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f\": container with ID starting with dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f not found: ID does not exist" containerID="dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f" Apr 23 18:20:17.391586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.391537 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f"} err="failed to get container status \"dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f\": rpc error: code = NotFound desc = could not find container \"dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f\": container with ID starting with dfa4a8914ced59c15f534f87a75d73c35d6344df617d0fcc91236baa9c6c209f not found: ID does not exist" Apr 23 18:20:17.391586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.391556 2568 scope.go:117] "RemoveContainer" containerID="75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad" Apr 23 18:20:17.391816 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:20:17.391797 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad\": container with ID starting with 75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad not found: ID does not exist" containerID="75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad" Apr 
23 18:20:17.391886 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.391826 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad"} err="failed to get container status \"75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad\": rpc error: code = NotFound desc = could not find container \"75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad\": container with ID starting with 75c313008baf5c9cee6dbefc4ad1fde8359a0f40511464e6dbf8191018fbc0ad not found: ID does not exist" Apr 23 18:20:17.391886 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.391850 2568 scope.go:117] "RemoveContainer" containerID="de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578" Apr 23 18:20:17.392115 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:20:17.392095 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578\": container with ID starting with de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578 not found: ID does not exist" containerID="de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578" Apr 23 18:20:17.392170 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.392122 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578"} err="failed to get container status \"de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578\": rpc error: code = NotFound desc = could not find container \"de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578\": container with ID starting with de9b62dbdef62d1d4faf02dda0b1ac2ab4a1e6ec29e7168035a64b312244b578 not found: ID does not exist" Apr 23 18:20:17.395318 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:17.395295 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-bdcr7"] Apr 23 18:20:18.599711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:18.599683 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" path="/var/lib/kubelet/pods/a21f9f8d-2c3e-4f56-b3d4-a8687d529064/volumes" Apr 23 18:20:19.378873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:19.378843 2568 generic.go:358] "Generic (PLEG): container finished" podID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerID="5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272" exitCode=0 Apr 23 18:20:19.379066 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:19.378921 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerDied","Data":"5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272"} Apr 23 18:20:20.384082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:20.384052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerStarted","Data":"a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831"} Apr 23 18:20:20.384520 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:20.384091 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerStarted","Data":"0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915"} Apr 23 18:20:20.384520 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:20.384388 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:20.384634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:20.384547 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:20.385890 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:20.385864 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:20:20.402532 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:20.402484 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podStartSLOduration=7.40247166 podStartE2EDuration="7.40247166s" podCreationTimestamp="2026-04-23 18:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:20:20.401149567 +0000 UTC m=+1678.321242043" watchObservedRunningTime="2026-04-23 18:20:20.40247166 +0000 UTC m=+1678.322564135" Apr 23 18:20:21.387467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:21.387428 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:20:22.390064 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:22.390022 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:20:27.394863 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:27.394835 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:20:27.395360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:27.395337 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:20:37.395916 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:37.395829 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:20:47.395885 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:47.395847 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:20:57.395477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:20:57.395435 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:21:07.395531 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:07.395494 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:21:17.395367 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:17.395328 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:21:27.396339 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:27.396298 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:21:37.395323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:37.395276 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:21:40.600905 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:40.600877 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:21:44.845011 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.844976 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"] Apr 23 18:21:44.845487 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.845396 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" containerID="cri-o://0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915" gracePeriod=30 Apr 23 18:21:44.845487 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.845437 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kube-rbac-proxy" containerID="cri-o://a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831" gracePeriod=30 Apr 23 18:21:44.984532 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984502 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"] Apr 23 18:21:44.984778 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984765 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kube-rbac-proxy" Apr 23 18:21:44.984827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984779 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kube-rbac-proxy" Apr 23 18:21:44.984827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984791 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="storage-initializer" Apr 23 18:21:44.984827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984796 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="storage-initializer" Apr 23 18:21:44.984827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984806 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" Apr 23 18:21:44.984827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984811 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" Apr 23 18:21:44.985011 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984869 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kserve-container" Apr 23 18:21:44.985011 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.984878 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a21f9f8d-2c3e-4f56-b3d4-a8687d529064" containerName="kube-rbac-proxy" Apr 23 18:21:44.987637 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.987618 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:44.989732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.989703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 23 18:21:44.989823 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:44.989737 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 18:21:45.000069 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.000045 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"] Apr 23 18:21:45.067696 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.067665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvmq\" (UniqueName: \"kubernetes.io/projected/cdfd3e18-e160-43d8-9414-c221598df44a-kube-api-access-tpvmq\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.067865 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.067712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfd3e18-e160-43d8-9414-c221598df44a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.067865 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.067800 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdfd3e18-e160-43d8-9414-c221598df44a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.067865 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.067844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdfd3e18-e160-43d8-9414-c221598df44a-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.168253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.168168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvmq\" (UniqueName: \"kubernetes.io/projected/cdfd3e18-e160-43d8-9414-c221598df44a-kube-api-access-tpvmq\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.168253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.168208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfd3e18-e160-43d8-9414-c221598df44a-kserve-provision-location\") pod 
\"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.168465 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.168326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdfd3e18-e160-43d8-9414-c221598df44a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.168465 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.168380 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdfd3e18-e160-43d8-9414-c221598df44a-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.168747 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.168726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfd3e18-e160-43d8-9414-c221598df44a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.169039 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.169021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdfd3e18-e160-43d8-9414-c221598df44a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.170619 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.170597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdfd3e18-e160-43d8-9414-c221598df44a-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.178124 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.178103 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpvmq\" (UniqueName: \"kubernetes.io/projected/cdfd3e18-e160-43d8-9414-c221598df44a-kube-api-access-tpvmq\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.297319 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.297290 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:45.415253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.415222 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"] Apr 23 18:21:45.418318 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:21:45.418268 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdfd3e18_e160_43d8_9414_c221598df44a.slice/crio-556afa71edde66d431c1ded813ebf6008155f62b1f4a62649d244db95e9ddf5d WatchSource:0}: Error finding container 556afa71edde66d431c1ded813ebf6008155f62b1f4a62649d244db95e9ddf5d: Status 404 returned error can't find the container with id 556afa71edde66d431c1ded813ebf6008155f62b1f4a62649d244db95e9ddf5d Apr 23 18:21:45.618560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.618515 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerStarted","Data":"faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30"} Apr 23 18:21:45.618560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.618564 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerStarted","Data":"556afa71edde66d431c1ded813ebf6008155f62b1f4a62649d244db95e9ddf5d"} Apr 23 18:21:45.620321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.620294 2568 generic.go:358] "Generic (PLEG): container finished" podID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerID="a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831" exitCode=2 Apr 23 18:21:45.620426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:45.620382 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerDied","Data":"a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831"} Apr 23 18:21:47.391036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:47.391001 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 23 18:21:48.581672 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.581641 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:21:48.594303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.594274 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kserve-provision-location\") pod \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " Apr 23 18:21:48.594428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.594308 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4gck\" (UniqueName: \"kubernetes.io/projected/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kube-api-access-z4gck\") pod \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " Apr 23 18:21:48.594428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.594332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls\") pod \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " Apr 23 18:21:48.594428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.594361 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\" (UID: \"a0f31bbf-d85a-4ac2-885a-7fbda9df085a\") " Apr 23 18:21:48.594657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.594631 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0f31bbf-d85a-4ac2-885a-7fbda9df085a" (UID: "a0f31bbf-d85a-4ac2-885a-7fbda9df085a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:21:48.594751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.594727 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "a0f31bbf-d85a-4ac2-885a-7fbda9df085a" (UID: "a0f31bbf-d85a-4ac2-885a-7fbda9df085a"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:21:48.596382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.596355 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kube-api-access-z4gck" (OuterVolumeSpecName: "kube-api-access-z4gck") pod "a0f31bbf-d85a-4ac2-885a-7fbda9df085a" (UID: "a0f31bbf-d85a-4ac2-885a-7fbda9df085a"). InnerVolumeSpecName "kube-api-access-z4gck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:21:48.596481 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.596396 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a0f31bbf-d85a-4ac2-885a-7fbda9df085a" (UID: "a0f31bbf-d85a-4ac2-885a-7fbda9df085a"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:21:48.629223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.629133 2568 generic.go:358] "Generic (PLEG): container finished" podID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerID="0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915" exitCode=0 Apr 23 18:21:48.629223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.629209 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" Apr 23 18:21:48.629382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.629228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerDied","Data":"0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915"} Apr 23 18:21:48.629382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.629268 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc" event={"ID":"a0f31bbf-d85a-4ac2-885a-7fbda9df085a","Type":"ContainerDied","Data":"72605d4f1487418b1ac003c2deeb15b2267598f38b0e3e3564b13b274aaa9a62"} Apr 23 18:21:48.629382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.629285 2568 scope.go:117] "RemoveContainer" containerID="a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831" Apr 23 18:21:48.636611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.636585 2568 scope.go:117] "RemoveContainer" containerID="0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915" Apr 23 18:21:48.643490 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.643470 2568 scope.go:117] "RemoveContainer" containerID="5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272" Apr 23 18:21:48.650634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.650612 2568 scope.go:117] "RemoveContainer" containerID="a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831" Apr 23 18:21:48.650869 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.650851 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"] Apr 23 18:21:48.650957 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:21:48.650926 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831\": container with ID starting with a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831 not found: ID does not exist" containerID="a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831" Apr 23 18:21:48.651009 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.650966 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831"} err="failed to get container status \"a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831\": rpc error: code = NotFound desc = could not find container \"a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831\": container with ID starting with a23ec7077b7cc1fb02da14ac3a61029f7b79765aac01d28838ffe9c323755831 not found: ID does not exist" Apr 23 18:21:48.651009 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.650986 2568 scope.go:117] "RemoveContainer" 
containerID="0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915" Apr 23 18:21:48.651226 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:21:48.651210 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915\": container with ID starting with 0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915 not found: ID does not exist" containerID="0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915" Apr 23 18:21:48.651265 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.651231 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915"} err="failed to get container status \"0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915\": rpc error: code = NotFound desc = could not find container \"0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915\": container with ID starting with 0bb2ee9900103c95517b8859c47b50abf5066c71a115ac397ec2809834e75915 not found: ID does not exist" Apr 23 18:21:48.651265 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.651246 2568 scope.go:117] "RemoveContainer" containerID="5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272" Apr 23 18:21:48.651451 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:21:48.651437 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272\": container with ID starting with 5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272 not found: ID does not exist" containerID="5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272" Apr 23 18:21:48.651494 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.651454 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272"} err="failed to get container status \"5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272\": rpc error: code = NotFound desc = could not find container \"5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272\": container with ID starting with 5831d390e44509caabbb1cc3e59687eb269027c2872fa45dadafc5cd9f717272 not found: ID does not exist" Apr 23 18:21:48.655261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.655239 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bxtjc"] Apr 23 18:21:48.695227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.695195 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:21:48.695227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.695225 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4gck\" (UniqueName: \"kubernetes.io/projected/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-kube-api-access-z4gck\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:21:48.695227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.695237 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:21:48.695466 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:48.695248 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0f31bbf-d85a-4ac2-885a-7fbda9df085a-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:21:49.632826 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:49.632790 2568 generic.go:358] "Generic (PLEG): container finished" podID="cdfd3e18-e160-43d8-9414-c221598df44a" containerID="faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30" exitCode=0 Apr 23 18:21:49.633315 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:49.632864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerDied","Data":"faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30"} Apr 23 18:21:50.599016 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:50.598982 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" path="/var/lib/kubelet/pods/a0f31bbf-d85a-4ac2-885a-7fbda9df085a/volumes" Apr 23 18:21:50.638581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:50.638547 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerStarted","Data":"1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2"} Apr 23 18:21:50.638960 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:50.638588 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerStarted","Data":"226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b"} Apr 23 18:21:50.638960 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:50.638819 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:50.668993 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:50.668924 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podStartSLOduration=6.668910402 podStartE2EDuration="6.668910402s" podCreationTimestamp="2026-04-23 18:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:21:50.666216946 +0000 UTC m=+1768.586309421" watchObservedRunningTime="2026-04-23 18:21:50.668910402 +0000 UTC m=+1768.589002876" Apr 23 18:21:51.641376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:51.641337 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:51.642437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:51.642412 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: 
connect: connection refused" Apr 23 18:21:52.644206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:52.644169 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:21:57.648902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:57.648869 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:21:57.649515 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:21:57.649486 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:07.649732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:07.649637 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:17.649617 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:17.649578 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:22.630951 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:22.630910 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:22:22.633509 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:22.633487 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:22:22.634516 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:22.634498 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:22:22.636985 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:22.636969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:22:27.649695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:27.649656 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:37.650122 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:37.650085 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.31:8080: connect: connection refused" Apr 23 18:22:47.650078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:47.650031 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:57.649345 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:22:57.649309 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:23:07.650362 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:07.650319 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:23:08.598480 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:08.598455 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" Apr 23 18:23:16.049618 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.049567 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"] Apr 23 18:23:16.050086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.049917 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" containerID="cri-o://226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b" gracePeriod=30 Apr 23 18:23:16.050086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.049976 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kube-rbac-proxy" containerID="cri-o://1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2" gracePeriod=30 Apr 23 18:23:16.167375 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167345 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b"] Apr 23 18:23:16.167648 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167634 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="storage-initializer" Apr 23 18:23:16.167695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167650 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="storage-initializer" Apr 23 18:23:16.167695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167666 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kube-rbac-proxy" Apr 23 18:23:16.167695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167672 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kube-rbac-proxy" Apr 23 
18:23:16.167695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167684 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" Apr 23 18:23:16.167695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167690 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" Apr 23 18:23:16.167851 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167736 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kserve-container" Apr 23 18:23:16.167851 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.167746 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0f31bbf-d85a-4ac2-885a-7fbda9df085a" containerName="kube-rbac-proxy" Apr 23 18:23:16.170661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.170642 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.172700 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.172678 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-5df402-predictor-serving-cert\"" Apr 23 18:23:16.172829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.172685 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-5df402-kube-rbac-proxy-sar-config\"" Apr 23 18:23:16.180190 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.180098 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b"] Apr 23 18:23:16.249478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.249427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nczr5\" (UniqueName: \"kubernetes.io/projected/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kube-api-access-nczr5\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.249478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.249482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kserve-provision-location\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.249708 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.249602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-proxy-tls\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.249708 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.249635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-isvc-primary-5df402-kube-rbac-proxy-sar-config\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.350276 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.350236 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-proxy-tls\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.350276 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.350283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-isvc-primary-5df402-kube-rbac-proxy-sar-config\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.350543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.350316 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nczr5\" (UniqueName: \"kubernetes.io/projected/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kube-api-access-nczr5\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.350543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.350341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kserve-provision-location\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.350819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.350799 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kserve-provision-location\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.351081 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.351061 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-isvc-primary-5df402-kube-rbac-proxy-sar-config\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.352674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.352654 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-proxy-tls\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " 
pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.359655 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.359623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nczr5\" (UniqueName: \"kubernetes.io/projected/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kube-api-access-nczr5\") pod \"isvc-primary-5df402-predictor-5766465f6-lzp5b\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.481512 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.481468 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:16.605466 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.605372 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b"] Apr 23 18:23:16.608724 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:23:16.608698 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b2f3978_5d08_4fa9_b961_309e6c14b2b4.slice/crio-c8c6ed7324b7d3bfd79d97102ec7dfc07c2f0cd0662f6f39cf6747373149f743 WatchSource:0}: Error finding container c8c6ed7324b7d3bfd79d97102ec7dfc07c2f0cd0662f6f39cf6747373149f743: Status 404 returned error can't find the container with id c8c6ed7324b7d3bfd79d97102ec7dfc07c2f0cd0662f6f39cf6747373149f743 Apr 23 18:23:16.610455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.610438 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:23:16.871004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.870883 2568 generic.go:358] "Generic (PLEG): container finished" podID="cdfd3e18-e160-43d8-9414-c221598df44a" containerID="1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2" exitCode=2 Apr 23 18:23:16.871004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.870960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerDied","Data":"1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2"} Apr 23 18:23:16.872368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.872341 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerStarted","Data":"739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247"} Apr 23 18:23:16.872513 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:16.872373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerStarted","Data":"c8c6ed7324b7d3bfd79d97102ec7dfc07c2f0cd0662f6f39cf6747373149f743"} Apr 23 18:23:17.644383 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:17.644343 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 23 18:23:18.596357 ip-10-0-131-144 kubenswrapper[2568]: I0423 
Apr 23 18:23:18.596357 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:18.596315 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 23 18:23:19.787578 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.787554 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"
Apr 23 18:23:19.878896 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.878866 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpvmq\" (UniqueName: \"kubernetes.io/projected/cdfd3e18-e160-43d8-9414-c221598df44a-kube-api-access-tpvmq\") pod \"cdfd3e18-e160-43d8-9414-c221598df44a\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") "
Apr 23 18:23:19.879123 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.878903 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdfd3e18-e160-43d8-9414-c221598df44a-proxy-tls\") pod \"cdfd3e18-e160-43d8-9414-c221598df44a\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") "
Apr 23 18:23:19.879123 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.878974 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdfd3e18-e160-43d8-9414-c221598df44a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"cdfd3e18-e160-43d8-9414-c221598df44a\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") "
Apr 23 18:23:19.879123 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.879018 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfd3e18-e160-43d8-9414-c221598df44a-kserve-provision-location\") pod \"cdfd3e18-e160-43d8-9414-c221598df44a\" (UID: \"cdfd3e18-e160-43d8-9414-c221598df44a\") "
Apr 23 18:23:19.879436 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.879382 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfd3e18-e160-43d8-9414-c221598df44a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cdfd3e18-e160-43d8-9414-c221598df44a" (UID: "cdfd3e18-e160-43d8-9414-c221598df44a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:23:19.879436 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.879386 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfd3e18-e160-43d8-9414-c221598df44a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "cdfd3e18-e160-43d8-9414-c221598df44a" (UID: "cdfd3e18-e160-43d8-9414-c221598df44a"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:23:19.881193 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.881171 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd3e18-e160-43d8-9414-c221598df44a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cdfd3e18-e160-43d8-9414-c221598df44a" (UID: "cdfd3e18-e160-43d8-9414-c221598df44a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:23:19.881477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.881456 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfd3e18-e160-43d8-9414-c221598df44a-kube-api-access-tpvmq" (OuterVolumeSpecName: "kube-api-access-tpvmq") pod "cdfd3e18-e160-43d8-9414-c221598df44a" (UID: "cdfd3e18-e160-43d8-9414-c221598df44a"). InnerVolumeSpecName "kube-api-access-tpvmq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:23:19.883831 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.883807 2568 generic.go:358] "Generic (PLEG): container finished" podID="cdfd3e18-e160-43d8-9414-c221598df44a" containerID="226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b" exitCode=0
Apr 23 18:23:19.883973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.883875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerDied","Data":"226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b"}
Apr 23 18:23:19.883973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.883900 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp" event={"ID":"cdfd3e18-e160-43d8-9414-c221598df44a","Type":"ContainerDied","Data":"556afa71edde66d431c1ded813ebf6008155f62b1f4a62649d244db95e9ddf5d"}
Apr 23 18:23:19.883973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.883906 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"
Apr 23 18:23:19.884148 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.883915 2568 scope.go:117] "RemoveContainer" containerID="1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2"
Apr 23 18:23:19.894410 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.894392 2568 scope.go:117] "RemoveContainer" containerID="226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b"
Apr 23 18:23:19.901259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.901241 2568 scope.go:117] "RemoveContainer" containerID="faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30"
Apr 23 18:23:19.907320 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.907298 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"]
Apr 23 18:23:19.908025 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.908013 2568 scope.go:117] "RemoveContainer" containerID="1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2"
Apr 23 18:23:19.908266 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:23:19.908244 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2\": container with ID starting with 1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2 not found: ID does not exist" containerID="1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2"
container status \"1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2\": rpc error: code = NotFound desc = could not find container \"1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2\": container with ID starting with 1d227cb3c49721cc0ad77a2af61304b519acf2bdd62d648ac424990cb17980f2 not found: ID does not exist" Apr 23 18:23:19.908322 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.908295 2568 scope.go:117] "RemoveContainer" containerID="226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b" Apr 23 18:23:19.908555 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:23:19.908536 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b\": container with ID starting with 226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b not found: ID does not exist" containerID="226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b" Apr 23 18:23:19.908608 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.908560 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b"} err="failed to get container status \"226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b\": rpc error: code = NotFound desc = could not find container \"226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b\": container with ID starting with 226ce440d329820c6b4f2e3516bbe9617625708030b4ce2add346a18e102793b not found: ID does not exist" Apr 23 18:23:19.908608 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.908591 2568 scope.go:117] "RemoveContainer" containerID="faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30" Apr 23 18:23:19.908820 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:23:19.908804 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30\": container with ID starting with faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30 not found: ID does not exist" containerID="faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30" Apr 23 18:23:19.908867 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.908824 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30"} err="failed to get container status \"faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30\": rpc error: code = NotFound desc = could not find container \"faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30\": container with ID starting with faaca26845dbbd289354c38305cc599f8bf33d7c8c6047463318954b6dba6e30 not found: ID does not exist" Apr 23 18:23:19.911968 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.911946 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-pm2zp"] Apr 23 18:23:19.980259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.980225 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpvmq\" (UniqueName: \"kubernetes.io/projected/cdfd3e18-e160-43d8-9414-c221598df44a-kube-api-access-tpvmq\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:23:19.980259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.980254 2568 
reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdfd3e18-e160-43d8-9414-c221598df44a-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:23:19.980259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.980267 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdfd3e18-e160-43d8-9414-c221598df44a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:23:19.980468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:19.980278 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfd3e18-e160-43d8-9414-c221598df44a-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:23:20.598723 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:20.598689 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" path="/var/lib/kubelet/pods/cdfd3e18-e160-43d8-9414-c221598df44a/volumes" Apr 23 18:23:20.887732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:20.887647 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerID="739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247" exitCode=0 Apr 23 18:23:20.887732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:20.887724 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerDied","Data":"739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247"} Apr 23 18:23:21.893453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:21.893412 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerStarted","Data":"24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702"} Apr 23 18:23:21.893453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:21.893450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerStarted","Data":"42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b"} Apr 23 18:23:21.893888 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:21.893643 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:21.921456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:21.921405 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podStartSLOduration=5.921389939 podStartE2EDuration="5.921389939s" podCreationTimestamp="2026-04-23 18:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:23:21.920399245 +0000 UTC m=+1859.840491720" watchObservedRunningTime="2026-04-23 18:23:21.921389939 +0000 UTC m=+1859.841482413" Apr 23 18:23:22.896499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:22.896458 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:22.897783 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:22.897755 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:23.899520 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:23.899485 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:28.904790 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:28.904760 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:23:28.905295 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:28.905267 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:38.905592 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:38.905515 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:48.905525 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:48.905481 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:58.905497 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:23:58.905453 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:24:08.906038 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:08.905995 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:24:18.906250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:18.906151 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:24:28.906078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:28.906048 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:24:36.372166 
ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372130 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"] Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372424 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kube-rbac-proxy" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372435 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kube-rbac-proxy" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372449 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="storage-initializer" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372455 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="storage-initializer" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372464 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372470 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372517 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kserve-container" Apr 23 18:24:36.372591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.372525 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdfd3e18-e160-43d8-9414-c221598df44a" containerName="kube-rbac-proxy" Apr 23 18:24:36.375539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.375522 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.377650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.377628 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-5df402\"" Apr 23 18:24:36.378015 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.377995 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-5df402-predictor-serving-cert\"" Apr 23 18:24:36.378126 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.377996 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-5df402-dockercfg-hprls\"" Apr 23 18:24:36.378126 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.378036 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-5df402-kube-rbac-proxy-sar-config\"" Apr 23 18:24:36.378126 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.378036 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:24:36.386489 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.386463 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"] Apr 23 18:24:36.533658 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.533613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-isvc-secondary-5df402-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.533658 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.533654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-cabundle-cert\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.533882 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.533698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntcx\" (UniqueName: \"kubernetes.io/projected/8d4e1951-715a-4431-b6e5-d344a39ea131-kube-api-access-kntcx\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.533882 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.533717 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4e1951-715a-4431-b6e5-d344a39ea131-kserve-provision-location\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.533882 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.533748 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.634294 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.634206 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kntcx\" (UniqueName: \"kubernetes.io/projected/8d4e1951-715a-4431-b6e5-d344a39ea131-kube-api-access-kntcx\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.634294 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.634246 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4e1951-715a-4431-b6e5-d344a39ea131-kserve-provision-location\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.634294 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.634280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.634589 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.634344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-isvc-secondary-5df402-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.634589 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.634366 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-cabundle-cert\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:36.634589 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:36.634468 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-5df402-predictor-serving-cert: secret "isvc-secondary-5df402-predictor-serving-cert" not found Apr 23 18:24:36.634589 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:36.634555 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls podName:8d4e1951-715a-4431-b6e5-d344a39ea131 nodeName:}" failed. No retries permitted until 2026-04-23 18:24:37.134531765 +0000 UTC m=+1935.054624232 (durationBeforeRetry 500ms). 
Apr 23 18:24:36.634589 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:36.634555 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls podName:8d4e1951-715a-4431-b6e5-d344a39ea131 nodeName:}" failed. No retries permitted until 2026-04-23 18:24:37.134531765 +0000 UTC m=+1935.054624232 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls") pod "isvc-secondary-5df402-predictor-599c8b945f-xzctz" (UID: "8d4e1951-715a-4431-b6e5-d344a39ea131") : secret "isvc-secondary-5df402-predictor-serving-cert" not found
Apr 23 18:24:36.634797 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.634706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4e1951-715a-4431-b6e5-d344a39ea131-kserve-provision-location\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:36.635112 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.635088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-isvc-secondary-5df402-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:36.635158 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.635095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-cabundle-cert\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:36.643514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:36.643489 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntcx\" (UniqueName: \"kubernetes.io/projected/8d4e1951-715a-4431-b6e5-d344a39ea131-kube-api-access-kntcx\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:37.139755 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:37.139718 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:37.142304 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:37.142270 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls\") pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:37.287149 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:37.287110 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"
Apr 23 18:24:37.412179 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:37.412065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"]
Apr 23 18:24:37.414804 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:24:37.414774 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d4e1951_715a_4431_b6e5_d344a39ea131.slice/crio-8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf WatchSource:0}: Error finding container 8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf: Status 404 returned error can't find the container with id 8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf
Apr 23 18:24:38.099569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:38.099534 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" event={"ID":"8d4e1951-715a-4431-b6e5-d344a39ea131","Type":"ContainerStarted","Data":"b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784"}
Apr 23 18:24:38.099569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:38.099570 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" event={"ID":"8d4e1951-715a-4431-b6e5-d344a39ea131","Type":"ContainerStarted","Data":"8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf"}
Apr 23 18:24:42.111072 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:42.111045 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/0.log"
Apr 23 18:24:42.111506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:42.111081 2568 generic.go:358] "Generic (PLEG): container finished" podID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerID="b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784" exitCode=1
Apr 23 18:24:42.111506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:42.111138 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" event={"ID":"8d4e1951-715a-4431-b6e5-d344a39ea131","Type":"ContainerDied","Data":"b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784"}
Apr 23 18:24:43.115624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:43.115590 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/0.log"
Apr 23 18:24:43.116048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:43.115658 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" event={"ID":"8d4e1951-715a-4431-b6e5-d344a39ea131","Type":"ContainerStarted","Data":"dece7cb02ae14a9bb4d63e255d416f49fd31e523f060fb6a52f234348a3fa421"}
Apr 23 18:24:49.134629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:49.134599 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/1.log"
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/0.log" Apr 23 18:24:49.135096 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:49.135041 2568 generic.go:358] "Generic (PLEG): container finished" podID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerID="dece7cb02ae14a9bb4d63e255d416f49fd31e523f060fb6a52f234348a3fa421" exitCode=1 Apr 23 18:24:49.135096 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:49.135077 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" event={"ID":"8d4e1951-715a-4431-b6e5-d344a39ea131","Type":"ContainerDied","Data":"dece7cb02ae14a9bb4d63e255d416f49fd31e523f060fb6a52f234348a3fa421"} Apr 23 18:24:49.135195 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:49.135105 2568 scope.go:117] "RemoveContainer" containerID="b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784" Apr 23 18:24:49.135492 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:49.135474 2568 scope.go:117] "RemoveContainer" containerID="b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784" Apr 23 18:24:49.145531 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:49.145504 2568 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-5df402-predictor-599c8b945f-xzctz_kserve-ci-e2e-test_8d4e1951-715a-4431-b6e5-d344a39ea131_0 in pod sandbox 8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf from index: no such id: 'b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784'" containerID="b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784" Apr 23 18:24:49.145617 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:49.145558 2568 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-5df402-predictor-599c8b945f-xzctz_kserve-ci-e2e-test_8d4e1951-715a-4431-b6e5-d344a39ea131_0 in pod sandbox 8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf from index: no such id: 'b8b71ed440be4fece5bf5cf64b07f2fe13a4f9b629ec55f9b7955081b8185784'; Skipping pod \"isvc-secondary-5df402-predictor-599c8b945f-xzctz_kserve-ci-e2e-test(8d4e1951-715a-4431-b6e5-d344a39ea131)\"" logger="UnhandledError" Apr 23 18:24:49.146896 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:49.146876 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-5df402-predictor-599c8b945f-xzctz_kserve-ci-e2e-test(8d4e1951-715a-4431-b6e5-d344a39ea131)\"" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" Apr 23 18:24:50.138900 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:50.138872 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/1.log" Apr 23 18:24:52.454762 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.454724 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"] Apr 23 18:24:52.492269 ip-10-0-131-144 kubenswrapper[2568]: 
I0423 18:24:52.492230 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b"] Apr 23 18:24:52.492642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.492593 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" containerID="cri-o://42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b" gracePeriod=30 Apr 23 18:24:52.492719 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.492644 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kube-rbac-proxy" containerID="cri-o://24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702" gracePeriod=30 Apr 23 18:24:52.609872 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.609833 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7"] Apr 23 18:24:52.613282 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.613259 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.613686 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.613667 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/1.log" Apr 23 18:24:52.613778 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.613752 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:52.615298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.615278 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-3eb8cd-dockercfg-cvhxd\"" Apr 23 18:24:52.615412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.615303 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-3eb8cd\"" Apr 23 18:24:52.615412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.615283 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\"" Apr 23 18:24:52.615412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.615321 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-3eb8cd-predictor-serving-cert\"" Apr 23 18:24:52.629336 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.629311 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7"] Apr 23 18:24:52.659163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659133 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-cabundle-cert\") pod \"8d4e1951-715a-4431-b6e5-d344a39ea131\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " Apr 23 18:24:52.659338 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659174 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4e1951-715a-4431-b6e5-d344a39ea131-kserve-provision-location\") pod \"8d4e1951-715a-4431-b6e5-d344a39ea131\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " Apr 23 18:24:52.659338 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659221 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kntcx\" (UniqueName: \"kubernetes.io/projected/8d4e1951-715a-4431-b6e5-d344a39ea131-kube-api-access-kntcx\") pod \"8d4e1951-715a-4431-b6e5-d344a39ea131\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " Apr 23 18:24:52.659338 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659282 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls\") pod \"8d4e1951-715a-4431-b6e5-d344a39ea131\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " Apr 23 18:24:52.659506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659349 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-isvc-secondary-5df402-kube-rbac-proxy-sar-config\") pod \"8d4e1951-715a-4431-b6e5-d344a39ea131\" (UID: \"8d4e1951-715a-4431-b6e5-d344a39ea131\") " Apr 23 18:24:52.659506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\") pod 
\"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.659506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-cabundle-cert\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.659741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659505 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d4e1951-715a-4431-b6e5-d344a39ea131-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d4e1951-715a-4431-b6e5-d344a39ea131" (UID: "8d4e1951-715a-4431-b6e5-d344a39ea131"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:24:52.659741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a40db4-7165-4f48-86a5-aa2b578fb14b-kserve-provision-location\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.659741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659630 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5tvm\" (UniqueName: \"kubernetes.io/projected/18a40db4-7165-4f48-86a5-aa2b578fb14b-kube-api-access-g5tvm\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.659741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659669 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.659741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659675 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "8d4e1951-715a-4431-b6e5-d344a39ea131" (UID: "8d4e1951-715a-4431-b6e5-d344a39ea131"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:24:52.659741 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659728 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-isvc-secondary-5df402-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-5df402-kube-rbac-proxy-sar-config") pod "8d4e1951-715a-4431-b6e5-d344a39ea131" (UID: "8d4e1951-715a-4431-b6e5-d344a39ea131"). InnerVolumeSpecName "isvc-secondary-5df402-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:24:52.660014 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659750 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-cabundle-cert\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:52.660014 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.659773 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4e1951-715a-4431-b6e5-d344a39ea131-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:52.661466 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.661446 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4e1951-715a-4431-b6e5-d344a39ea131-kube-api-access-kntcx" (OuterVolumeSpecName: "kube-api-access-kntcx") pod "8d4e1951-715a-4431-b6e5-d344a39ea131" (UID: "8d4e1951-715a-4431-b6e5-d344a39ea131"). InnerVolumeSpecName "kube-api-access-kntcx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:24:52.661575 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.661557 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8d4e1951-715a-4431-b6e5-d344a39ea131" (UID: "8d4e1951-715a-4431-b6e5-d344a39ea131"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:24:52.761098 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-cabundle-cert\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.761098 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761067 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a40db4-7165-4f48-86a5-aa2b578fb14b-kserve-provision-location\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5tvm\" (UniqueName: \"kubernetes.io/projected/18a40db4-7165-4f48-86a5-aa2b578fb14b-kube-api-access-g5tvm\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761192 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761226 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kntcx\" (UniqueName: \"kubernetes.io/projected/8d4e1951-715a-4431-b6e5-d344a39ea131-kube-api-access-kntcx\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761239 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d4e1951-715a-4431-b6e5-d344a39ea131-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761253 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d4e1951-715a-4431-b6e5-d344a39ea131-isvc-secondary-5df402-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:52.761343 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:52.761312 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-serving-cert: secret "isvc-init-fail-3eb8cd-predictor-serving-cert" not found Apr 23 18:24:52.761816 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:52.761395 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls podName:18a40db4-7165-4f48-86a5-aa2b578fb14b nodeName:}" failed. No retries permitted until 2026-04-23 18:24:53.261375164 +0000 UTC m=+1951.181467630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls") pod "isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b") : secret "isvc-init-fail-3eb8cd-predictor-serving-cert" not found Apr 23 18:24:52.761816 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a40db4-7165-4f48-86a5-aa2b578fb14b-kserve-provision-location\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.761816 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-cabundle-cert\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.762001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.761907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:52.771328 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:52.771299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5tvm\" (UniqueName: \"kubernetes.io/projected/18a40db4-7165-4f48-86a5-aa2b578fb14b-kube-api-access-g5tvm\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:53.149606 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.149567 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerID="24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702" exitCode=2 Apr 23 18:24:53.149830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.149641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerDied","Data":"24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702"} Apr 23 18:24:53.150701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.150676 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5df402-predictor-599c8b945f-xzctz_8d4e1951-715a-4431-b6e5-d344a39ea131/storage-initializer/1.log" Apr 23 18:24:53.150824 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.150760 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" event={"ID":"8d4e1951-715a-4431-b6e5-d344a39ea131","Type":"ContainerDied","Data":"8084f7aa5b9fd3b36c5c1c107c91661cd2017b26b166d2f417e76b77d7a03aaf"} Apr 23 18:24:53.150824 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.150787 2568 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz" Apr 23 18:24:53.150824 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.150792 2568 scope.go:117] "RemoveContainer" containerID="dece7cb02ae14a9bb4d63e255d416f49fd31e523f060fb6a52f234348a3fa421" Apr 23 18:24:53.189552 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.189520 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"] Apr 23 18:24:53.195649 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.195623 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5df402-predictor-599c8b945f-xzctz"] Apr 23 18:24:53.265285 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.265242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:53.265464 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:53.265390 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-serving-cert: secret "isvc-init-fail-3eb8cd-predictor-serving-cert" not found Apr 23 18:24:53.265464 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:53.265456 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls podName:18a40db4-7165-4f48-86a5-aa2b578fb14b nodeName:}" failed. No retries permitted until 2026-04-23 18:24:54.265441106 +0000 UTC m=+1952.185533559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls") pod "isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b") : secret "isvc-init-fail-3eb8cd-predictor-serving-cert" not found Apr 23 18:24:53.900307 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:53.900241 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 23 18:24:54.274616 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:54.274502 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:54.277163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:54.277135 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") pod \"isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:54.422726 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:54.422675 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:24:54.547005 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:54.546911 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7"] Apr 23 18:24:54.550314 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:24:54.550281 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a40db4_7165_4f48_86a5_aa2b578fb14b.slice/crio-3e6ee98980ba3ecb214923200eb3e3a5660aab203543bd6eb71e1a2b7fb66044 WatchSource:0}: Error finding container 3e6ee98980ba3ecb214923200eb3e3a5660aab203543bd6eb71e1a2b7fb66044: Status 404 returned error can't find the container with id 3e6ee98980ba3ecb214923200eb3e3a5660aab203543bd6eb71e1a2b7fb66044 Apr 23 18:24:54.598758 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:54.598728 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" path="/var/lib/kubelet/pods/8d4e1951-715a-4431-b6e5-d344a39ea131/volumes" Apr 23 18:24:55.158445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:55.158405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" event={"ID":"18a40db4-7165-4f48-86a5-aa2b578fb14b","Type":"ContainerStarted","Data":"bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8"} Apr 23 18:24:55.158970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:55.158453 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" 
event={"ID":"18a40db4-7165-4f48-86a5-aa2b578fb14b","Type":"ContainerStarted","Data":"3e6ee98980ba3ecb214923200eb3e3a5660aab203543bd6eb71e1a2b7fb66044"} Apr 23 18:24:56.932327 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.932301 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:24:56.997090 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.997049 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-proxy-tls\") pod \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " Apr 23 18:24:56.997298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.997122 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-isvc-primary-5df402-kube-rbac-proxy-sar-config\") pod \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " Apr 23 18:24:56.997298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.997166 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kserve-provision-location\") pod \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " Apr 23 18:24:56.997298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.997192 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nczr5\" (UniqueName: \"kubernetes.io/projected/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kube-api-access-nczr5\") pod \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\" (UID: \"1b2f3978-5d08-4fa9-b961-309e6c14b2b4\") " Apr 23 18:24:56.997539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.997507 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-isvc-primary-5df402-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-5df402-kube-rbac-proxy-sar-config") pod "1b2f3978-5d08-4fa9-b961-309e6c14b2b4" (UID: "1b2f3978-5d08-4fa9-b961-309e6c14b2b4"). InnerVolumeSpecName "isvc-primary-5df402-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:24:56.997604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.997511 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b2f3978-5d08-4fa9-b961-309e6c14b2b4" (UID: "1b2f3978-5d08-4fa9-b961-309e6c14b2b4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:24:56.999306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.999284 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kube-api-access-nczr5" (OuterVolumeSpecName: "kube-api-access-nczr5") pod "1b2f3978-5d08-4fa9-b961-309e6c14b2b4" (UID: "1b2f3978-5d08-4fa9-b961-309e6c14b2b4"). InnerVolumeSpecName "kube-api-access-nczr5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:24:56.999386 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:56.999346 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b2f3978-5d08-4fa9-b961-309e6c14b2b4" (UID: "1b2f3978-5d08-4fa9-b961-309e6c14b2b4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:24:57.098001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.097965 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-5df402-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-isvc-primary-5df402-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:57.098001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.098001 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:57.098212 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.098011 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nczr5\" (UniqueName: \"kubernetes.io/projected/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-kube-api-access-nczr5\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:57.098212 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.098020 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b2f3978-5d08-4fa9-b961-309e6c14b2b4-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:24:57.165731 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.165701 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerID="42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b" exitCode=0 Apr 23 18:24:57.165909 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.165785 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" Apr 23 18:24:57.165909 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.165788 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerDied","Data":"42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b"} Apr 23 18:24:57.165909 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.165888 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b" event={"ID":"1b2f3978-5d08-4fa9-b961-309e6c14b2b4","Type":"ContainerDied","Data":"c8c6ed7324b7d3bfd79d97102ec7dfc07c2f0cd0662f6f39cf6747373149f743"} Apr 23 18:24:57.166052 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.165905 2568 scope.go:117] "RemoveContainer" containerID="24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702" Apr 23 18:24:57.173674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.173618 2568 scope.go:117] "RemoveContainer" containerID="42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b" Apr 23 18:24:57.180676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.180657 2568 scope.go:117] "RemoveContainer" containerID="739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247" Apr 23 18:24:57.187738 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.187716 2568 scope.go:117] "RemoveContainer" containerID="24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702" Apr 23 18:24:57.188040 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:57.188016 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702\": container with ID starting with 24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702 not found: ID does not exist" containerID="24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702" Apr 23 18:24:57.188104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.188050 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702"} err="failed to get container status \"24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702\": rpc error: code = NotFound desc = could not find container \"24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702\": container with ID starting with 24cc04c01dc55fd060df3ad82e79b5c8451ad508d5bf3cf83770cb92cbd4b702 not found: ID does not exist" Apr 23 18:24:57.188104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.188069 2568 scope.go:117] "RemoveContainer" containerID="42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b" Apr 23 18:24:57.188441 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:57.188420 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b\": container with ID starting with 42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b not found: ID does not exist" containerID="42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b" Apr 23 18:24:57.188441 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.188445 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b"} err="failed to get container status \"42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b\": rpc error: code = NotFound desc = could not find container \"42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b\": container with ID starting with 42d49ede221f38fd919b996cade1e7a39e37164b849aa1b46e5f66190371a17b not found: ID does not exist" Apr 23 18:24:57.188594 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.188460 2568 scope.go:117] "RemoveContainer" containerID="739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247" Apr 23 18:24:57.188752 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:24:57.188714 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247\": container with ID starting with 739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247 not found: ID does not exist" containerID="739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247" Apr 23 18:24:57.188825 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.188760 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247"} err="failed to get container status \"739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247\": rpc error: code = NotFound desc = could not find container \"739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247\": container with ID starting with 739fb25f35738ebaa85bbc0172e94920af837579c52fd41bd8e9a504d2c70247 not found: ID does not exist" Apr 23 18:24:57.190359 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.190335 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b"] Apr 23 18:24:57.196910 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:57.196886 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5df402-predictor-5766465f6-lzp5b"] Apr 23 18:24:58.171595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:58.171522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7_18a40db4-7165-4f48-86a5-aa2b578fb14b/storage-initializer/0.log" Apr 23 18:24:58.171595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:58.171558 2568 generic.go:358] "Generic (PLEG): container finished" podID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerID="bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8" exitCode=1 Apr 23 18:24:58.172005 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:58.171632 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" event={"ID":"18a40db4-7165-4f48-86a5-aa2b578fb14b","Type":"ContainerDied","Data":"bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8"} Apr 23 18:24:58.603161 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:58.603123 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" path="/var/lib/kubelet/pods/1b2f3978-5d08-4fa9-b961-309e6c14b2b4/volumes" Apr 23 18:24:59.176093 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:59.176063 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7_18a40db4-7165-4f48-86a5-aa2b578fb14b/storage-initializer/0.log" Apr 23 18:24:59.176550 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:24:59.176180 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" event={"ID":"18a40db4-7165-4f48-86a5-aa2b578fb14b","Type":"ContainerStarted","Data":"c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21"} Apr 23 18:25:02.583078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.582993 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7"] Apr 23 18:25:02.583504 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.583322 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" containerID="cri-o://c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21" gracePeriod=30 Apr 23 18:25:02.750353 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750321 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"] Apr 23 18:25:02.750623 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750610 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerName="storage-initializer" Apr 23 18:25:02.750673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750626 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerName="storage-initializer" Apr 23 18:25:02.750673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750641 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" Apr 23 18:25:02.750673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750647 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" Apr 23 18:25:02.750673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750660 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="storage-initializer" Apr 23 18:25:02.750673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750666 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="storage-initializer" Apr 23 18:25:02.750673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750673 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kube-rbac-proxy" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750679 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kube-rbac-proxy" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750724 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kube-rbac-proxy" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750733 2568 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerName="storage-initializer" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750741 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b2f3978-5d08-4fa9-b961-309e6c14b2b4" containerName="kserve-container" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750747 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerName="storage-initializer" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750810 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerName="storage-initializer" Apr 23 18:25:02.750860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.750816 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4e1951-715a-4431-b6e5-d344a39ea131" containerName="storage-initializer" Apr 23 18:25:02.753868 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.753849 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.755632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.755611 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 23 18:25:02.755876 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.755860 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 23 18:25:02.755990 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.755863 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-nl4mc\"" Apr 23 18:25:02.764459 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.764438 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"] Apr 23 18:25:02.840979 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.840874 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.840979 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.840968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e45d1c3c-bb34-4044-b416-b9f6123a4c97-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.841186 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.841002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbg7j\" (UniqueName: \"kubernetes.io/projected/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kube-api-access-kbg7j\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.841186 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.841024 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.942055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.942024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e45d1c3c-bb34-4044-b416-b9f6123a4c97-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.942223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.942065 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbg7j\" (UniqueName: \"kubernetes.io/projected/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kube-api-access-kbg7j\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.942223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.942092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.942223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.942166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.942409 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:25:02.942308 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 23 18:25:02.942409 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:25:02.942374 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls podName:e45d1c3c-bb34-4044-b416-b9f6123a4c97 nodeName:}" failed. No retries permitted until 2026-04-23 18:25:03.442353876 +0000 UTC m=+1961.362446348 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" (UID: "e45d1c3c-bb34-4044-b416-b9f6123a4c97") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 23 18:25:02.942534 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.942510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.942736 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.942718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e45d1c3c-bb34-4044-b416-b9f6123a4c97-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:02.959840 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:02.959816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbg7j\" (UniqueName: \"kubernetes.io/projected/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kube-api-access-kbg7j\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:03.446247 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.446212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:03.448641 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.448621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-6lst9\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:03.663401 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.663358 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:03.794255 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.794211 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"] Apr 23 18:25:03.819246 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:25:03.819216 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45d1c3c_bb34_4044_b416_b9f6123a4c97.slice/crio-e462ff653c0cd055eb2e401682662ee3723a48d591e712d83924a17493cd2081 WatchSource:0}: Error finding container e462ff653c0cd055eb2e401682662ee3723a48d591e712d83924a17493cd2081: Status 404 returned error can't find the container with id e462ff653c0cd055eb2e401682662ee3723a48d591e712d83924a17493cd2081 Apr 23 18:25:03.922743 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.922720 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7_18a40db4-7165-4f48-86a5-aa2b578fb14b/storage-initializer/1.log" Apr 23 18:25:03.923112 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.923091 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7_18a40db4-7165-4f48-86a5-aa2b578fb14b/storage-initializer/0.log" Apr 23 18:25:03.923234 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.923169 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:25:03.951907 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.951874 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a40db4-7165-4f48-86a5-aa2b578fb14b-kserve-provision-location\") pod \"18a40db4-7165-4f48-86a5-aa2b578fb14b\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " Apr 23 18:25:03.952087 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.951916 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") pod \"18a40db4-7165-4f48-86a5-aa2b578fb14b\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " Apr 23 18:25:03.952087 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.951987 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-cabundle-cert\") pod \"18a40db4-7165-4f48-86a5-aa2b578fb14b\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " Apr 23 18:25:03.952087 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.952010 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\") pod \"18a40db4-7165-4f48-86a5-aa2b578fb14b\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " Apr 23 18:25:03.952252 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.952149 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5tvm\" (UniqueName: \"kubernetes.io/projected/18a40db4-7165-4f48-86a5-aa2b578fb14b-kube-api-access-g5tvm\") pod 
\"18a40db4-7165-4f48-86a5-aa2b578fb14b\" (UID: \"18a40db4-7165-4f48-86a5-aa2b578fb14b\") " Apr 23 18:25:03.952252 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.952188 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18a40db4-7165-4f48-86a5-aa2b578fb14b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "18a40db4-7165-4f48-86a5-aa2b578fb14b" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:25:03.952387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.952367 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a40db4-7165-4f48-86a5-aa2b578fb14b-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:25:03.952433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.952379 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config") pod "18a40db4-7165-4f48-86a5-aa2b578fb14b" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b"). InnerVolumeSpecName "isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:25:03.952433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.952383 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "18a40db4-7165-4f48-86a5-aa2b578fb14b" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:25:03.954125 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.954106 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "18a40db4-7165-4f48-86a5-aa2b578fb14b" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:25:03.954180 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:03.954152 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a40db4-7165-4f48-86a5-aa2b578fb14b-kube-api-access-g5tvm" (OuterVolumeSpecName: "kube-api-access-g5tvm") pod "18a40db4-7165-4f48-86a5-aa2b578fb14b" (UID: "18a40db4-7165-4f48-86a5-aa2b578fb14b"). InnerVolumeSpecName "kube-api-access-g5tvm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:25:04.053629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.053595 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-cabundle-cert\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:25:04.053629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.053623 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18a40db4-7165-4f48-86a5-aa2b578fb14b-isvc-init-fail-3eb8cd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:25:04.053629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.053634 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5tvm\" (UniqueName: \"kubernetes.io/projected/18a40db4-7165-4f48-86a5-aa2b578fb14b-kube-api-access-g5tvm\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:25:04.053864 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.053645 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18a40db4-7165-4f48-86a5-aa2b578fb14b-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:25:04.191073 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.191033 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerStarted","Data":"19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66"} Apr 23 18:25:04.191073 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.191077 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerStarted","Data":"e462ff653c0cd055eb2e401682662ee3723a48d591e712d83924a17493cd2081"} Apr 23 18:25:04.192158 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192137 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7_18a40db4-7165-4f48-86a5-aa2b578fb14b/storage-initializer/1.log" Apr 23 18:25:04.192460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192445 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7_18a40db4-7165-4f48-86a5-aa2b578fb14b/storage-initializer/0.log" Apr 23 18:25:04.192538 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192477 2568 generic.go:358] "Generic (PLEG): container finished" podID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerID="c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21" exitCode=1 Apr 23 18:25:04.192538 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" event={"ID":"18a40db4-7165-4f48-86a5-aa2b578fb14b","Type":"ContainerDied","Data":"c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21"} Apr 23 18:25:04.192655 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" 
event={"ID":"18a40db4-7165-4f48-86a5-aa2b578fb14b","Type":"ContainerDied","Data":"3e6ee98980ba3ecb214923200eb3e3a5660aab203543bd6eb71e1a2b7fb66044"} Apr 23 18:25:04.192655 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192548 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7" Apr 23 18:25:04.192655 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.192554 2568 scope.go:117] "RemoveContainer" containerID="c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21" Apr 23 18:25:04.201078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.201059 2568 scope.go:117] "RemoveContainer" containerID="bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8" Apr 23 18:25:04.208379 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.208363 2568 scope.go:117] "RemoveContainer" containerID="c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21" Apr 23 18:25:04.208629 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:25:04.208611 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21\": container with ID starting with c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21 not found: ID does not exist" containerID="c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21" Apr 23 18:25:04.208678 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.208639 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21"} err="failed to get container status \"c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21\": rpc error: code = NotFound desc = could not find container \"c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21\": container with ID starting with c7d8c1d14640d71316895a0c1f91f9b67c2f8d41482f37c808bdbd7d95db2e21 not found: ID does not exist" Apr 23 18:25:04.208678 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.208656 2568 scope.go:117] "RemoveContainer" containerID="bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8" Apr 23 18:25:04.208869 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:25:04.208854 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8\": container with ID starting with bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8 not found: ID does not exist" containerID="bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8" Apr 23 18:25:04.208906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.208872 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8"} err="failed to get container status \"bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8\": rpc error: code = NotFound desc = could not find container \"bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8\": container with ID starting with bd80f286fa714c0884ee78859875251eab2469a22efca132b3262dfb82993db8 not found: ID does not exist" Apr 23 18:25:04.246000 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.242922 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7"] Apr 23 18:25:04.249465 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.249424 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3eb8cd-predictor-cfb9c76df-vlnr7"] Apr 23 18:25:04.598431 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:04.598402 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" path="/var/lib/kubelet/pods/18a40db4-7165-4f48-86a5-aa2b578fb14b/volumes" Apr 23 18:25:08.206200 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:08.206159 2568 generic.go:358] "Generic (PLEG): container finished" podID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerID="19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66" exitCode=0 Apr 23 18:25:08.206657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:08.206221 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerDied","Data":"19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66"} Apr 23 18:25:29.279904 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:29.279870 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerStarted","Data":"3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29"} Apr 23 18:25:29.279904 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:29.279912 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerStarted","Data":"6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d"} Apr 23 18:25:29.280378 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:29.280123 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:29.303261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:29.303206 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podStartSLOduration=7.053853482 podStartE2EDuration="27.303190478s" podCreationTimestamp="2026-04-23 18:25:02 +0000 UTC" firstStartedPulling="2026-04-23 18:25:08.20741012 +0000 UTC m=+1966.127502573" lastFinishedPulling="2026-04-23 18:25:28.456747112 +0000 UTC m=+1986.376839569" observedRunningTime="2026-04-23 18:25:29.302316767 +0000 UTC m=+1987.222409242" watchObservedRunningTime="2026-04-23 18:25:29.303190478 +0000 UTC m=+1987.223282953" Apr 23 18:25:30.283214 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:30.283181 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:30.284156 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:30.284129 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:25:31.287131 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:31.287093 2568 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:25:36.291947 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:36.291901 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:25:36.292530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:36.292503 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:25:46.293037 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:46.292997 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:25:56.293094 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:25:56.293051 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:26:06.292422 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:06.292371 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:26:16.292463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:16.292422 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:26:26.293357 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:26.293313 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:26:36.292642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:36.292555 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:26:44.599306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:44.599280 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" Apr 23 18:26:52.903849 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:52.903820 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"] Apr 23 18:26:52.904451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:52.904144 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" containerID="cri-o://6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d" gracePeriod=30 Apr 23 18:26:52.904451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:52.904166 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kube-rbac-proxy" containerID="cri-o://3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29" gracePeriod=30 Apr 23 18:26:53.019544 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.019512 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"] Apr 23 18:26:53.019906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.019881 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" Apr 23 18:26:53.019906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.019906 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" Apr 23 18:26:53.020093 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.020042 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" Apr 23 18:26:53.020093 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.020058 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" Apr 23 18:26:53.020198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.020128 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" Apr 23 18:26:53.020198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.020139 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a40db4-7165-4f48-86a5-aa2b578fb14b" containerName="storage-initializer" Apr 23 18:26:53.023013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.022994 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.024686 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.024661 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 23 18:26:53.025015 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.024977 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 23 18:26:53.032235 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.032216 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"] Apr 23 18:26:53.073898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.073879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.074076 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.073920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.074076 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.074008 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.074157 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.074102 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rln6d\" (UniqueName: \"kubernetes.io/projected/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kube-api-access-rln6d\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.174963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.174854 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rln6d\" (UniqueName: \"kubernetes.io/projected/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kube-api-access-rln6d\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.174963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.174900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-proxy-tls\") pod 
\"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.174963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.174945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.175236 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.174977 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.175460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.175434 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.175691 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.175672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.177274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.177255 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.186253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.186231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rln6d\" (UniqueName: \"kubernetes.io/projected/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kube-api-access-rln6d\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:26:53.334316 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.334288 2568 util.go:30] "No sandbox for pod can be found. 
Apr 23 18:26:53.453713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.453539 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"]
Apr 23 18:26:53.456240 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:26:53.456210 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9110aa9a_a256_44c3_a2f3_08f4bb955ab0.slice/crio-ba358316fa3fbd5edcf9817dab14260dbd6bbceb21de8d353fa39c05dace95b8 WatchSource:0}: Error finding container ba358316fa3fbd5edcf9817dab14260dbd6bbceb21de8d353fa39c05dace95b8: Status 404 returned error can't find the container with id ba358316fa3fbd5edcf9817dab14260dbd6bbceb21de8d353fa39c05dace95b8
Apr 23 18:26:53.509360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.509337 2568 generic.go:358] "Generic (PLEG): container finished" podID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerID="3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29" exitCode=2
Apr 23 18:26:53.509479 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.509408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerDied","Data":"3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29"}
Apr 23 18:26:53.510388 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:53.510368 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerStarted","Data":"ba358316fa3fbd5edcf9817dab14260dbd6bbceb21de8d353fa39c05dace95b8"}
Apr 23 18:26:54.514561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:54.514520 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerStarted","Data":"c99cfbfd4f7b2c15f31aa972822c203128c05a17fea9d7a74193ede1442507f4"}
Apr 23 18:26:54.596495 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:54.596461 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 18:26:56.288173 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:56.288129 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused"
Apr 23 18:26:57.523755 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.523664 2568 generic.go:358] "Generic (PLEG): container finished" podID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerID="c99cfbfd4f7b2c15f31aa972822c203128c05a17fea9d7a74193ede1442507f4" exitCode=0
Apr 23 18:26:57.523755 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.523740 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerDied","Data":"c99cfbfd4f7b2c15f31aa972822c203128c05a17fea9d7a74193ede1442507f4"}
Apr 23 18:26:57.843260 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.843235 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"
Apr 23 18:26:57.909035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.909005 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbg7j\" (UniqueName: \"kubernetes.io/projected/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kube-api-access-kbg7j\") pod \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") "
Apr 23 18:26:57.909250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.909051 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kserve-provision-location\") pod \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") "
Apr 23 18:26:57.909250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.909099 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e45d1c3c-bb34-4044-b416-b9f6123a4c97-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") "
Apr 23 18:26:57.909250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.909134 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls\") pod \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\" (UID: \"e45d1c3c-bb34-4044-b416-b9f6123a4c97\") "
Apr 23 18:26:57.909437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.909407 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e45d1c3c-bb34-4044-b416-b9f6123a4c97" (UID: "e45d1c3c-bb34-4044-b416-b9f6123a4c97"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:26:57.909514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.909448 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e45d1c3c-bb34-4044-b416-b9f6123a4c97-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "e45d1c3c-bb34-4044-b416-b9f6123a4c97" (UID: "e45d1c3c-bb34-4044-b416-b9f6123a4c97"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:26:57.911542 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.911515 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e45d1c3c-bb34-4044-b416-b9f6123a4c97" (UID: "e45d1c3c-bb34-4044-b416-b9f6123a4c97"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:26:57.911661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:57.911561 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kube-api-access-kbg7j" (OuterVolumeSpecName: "kube-api-access-kbg7j") pod "e45d1c3c-bb34-4044-b416-b9f6123a4c97" (UID: "e45d1c3c-bb34-4044-b416-b9f6123a4c97"). InnerVolumeSpecName "kube-api-access-kbg7j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:26:58.009799 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.009764 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbg7j\" (UniqueName: \"kubernetes.io/projected/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kube-api-access-kbg7j\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:26:58.009799 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.009793 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45d1c3c-bb34-4044-b416-b9f6123a4c97-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:26:58.009799 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.009805 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e45d1c3c-bb34-4044-b416-b9f6123a4c97-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:26:58.010068 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.009817 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e45d1c3c-bb34-4044-b416-b9f6123a4c97-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:26:58.528163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.528076 2568 generic.go:358] "Generic (PLEG): container finished" podID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerID="6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d" exitCode=0
Apr 23 18:26:58.528163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.528141 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerDied","Data":"6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d"}
Apr 23 18:26:58.528163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.528164 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9" event={"ID":"e45d1c3c-bb34-4044-b416-b9f6123a4c97","Type":"ContainerDied","Data":"e462ff653c0cd055eb2e401682662ee3723a48d591e712d83924a17493cd2081"}
Apr 23 18:26:58.528811 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.528164 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"
Apr 23 18:26:58.528811 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.528179 2568 scope.go:117] "RemoveContainer" containerID="3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29"
Apr 23 18:26:58.530110 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.530087 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerStarted","Data":"1d3163d324104fda9a0ce24475e2186a89f6558881c778197212caca28495a7c"}
Apr 23 18:26:58.530206 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.530113 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerStarted","Data":"a5e3ee3a5340b5da198b5a58bd430ab826cdb63d68fa15a2191278f3bedc865e"}
Apr 23 18:26:58.530439 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.530424 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"
Apr 23 18:26:58.530495 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.530453 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"
Apr 23 18:26:58.532038 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.532012 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 18:26:58.535980 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.535961 2568 scope.go:117] "RemoveContainer" containerID="6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d"
Apr 23 18:26:58.542747 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.542730 2568 scope.go:117] "RemoveContainer" containerID="19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66"
Apr 23 18:26:58.549010 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.548993 2568 scope.go:117] "RemoveContainer" containerID="3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29"
Apr 23 18:26:58.549258 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:26:58.549232 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29\": container with ID starting with 3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29 not found: ID does not exist" containerID="3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29"
Apr 23 18:26:58.549348 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.549269 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29"} err="failed to get container status \"3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29\": rpc error: code = NotFound desc = could not find container \"3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29\": container with ID starting with 3ac731b04e7264e67aeb6972bb19821ad8421198609444222afac246e3394f29 not found: ID does not exist"
does not exist" Apr 23 18:26:58.549348 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.549288 2568 scope.go:117] "RemoveContainer" containerID="6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d" Apr 23 18:26:58.549509 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:26:58.549492 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d\": container with ID starting with 6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d not found: ID does not exist" containerID="6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d" Apr 23 18:26:58.549548 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.549517 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d"} err="failed to get container status \"6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d\": rpc error: code = NotFound desc = could not find container \"6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d\": container with ID starting with 6099c5ac9e6d251359e9a64bd35fe5177f3b5b09c24068e9ad07da8e86924b8d not found: ID does not exist" Apr 23 18:26:58.549548 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.549532 2568 scope.go:117] "RemoveContainer" containerID="19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66" Apr 23 18:26:58.549780 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:26:58.549763 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66\": container with ID starting with 19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66 not found: ID does not exist" containerID="19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66" Apr 23 18:26:58.549821 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.549785 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66"} err="failed to get container status \"19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66\": rpc error: code = NotFound desc = could not find container \"19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66\": container with ID starting with 19eb35eae724f4db31e8648cacc41451f3daedc4ef5204f1a6cf9a6eab39cb66 not found: ID does not exist" Apr 23 18:26:58.559293 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.559244 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podStartSLOduration=5.559228879 podStartE2EDuration="5.559228879s" podCreationTimestamp="2026-04-23 18:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:26:58.556398116 +0000 UTC m=+2076.476490601" watchObservedRunningTime="2026-04-23 18:26:58.559228879 +0000 UTC m=+2076.479321353" Apr 23 18:26:58.580982 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.580954 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"] Apr 23 18:26:58.586502 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.586473 2568 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-6lst9"] Apr 23 18:26:58.599297 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:58.599272 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" path="/var/lib/kubelet/pods/e45d1c3c-bb34-4044-b416-b9f6123a4c97/volumes" Apr 23 18:26:59.541345 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:26:59.541310 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:27:04.545290 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:04.545262 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:27:04.545829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:04.545804 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:27:14.546280 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:14.546238 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:27:22.651660 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:22.651631 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:27:22.654154 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:22.654130 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:27:22.655382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:22.655364 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:27:22.657737 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:22.657712 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:27:24.545907 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:24.545867 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:27:34.545806 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:34.545765 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: 
connect: connection refused" Apr 23 18:27:44.546443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:44.546404 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:27:54.546444 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:27:54.546401 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:28:04.546091 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:04.546003 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:28:07.595433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:07.595394 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:28:17.596088 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:17.596052 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:28:23.112660 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.112628 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"] Apr 23 18:28:23.113137 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.112988 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" containerID="cri-o://a5e3ee3a5340b5da198b5a58bd430ab826cdb63d68fa15a2191278f3bedc865e" gracePeriod=30 Apr 23 18:28:23.113137 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.113084 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kube-rbac-proxy" containerID="cri-o://1d3163d324104fda9a0ce24475e2186a89f6558881c778197212caca28495a7c" gracePeriod=30 Apr 23 18:28:23.226574 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226544 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp"] Apr 23 18:28:23.226821 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226808 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" Apr 23 18:28:23.226821 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226821 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" Apr 23 18:28:23.226970 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:28:23.226842 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="storage-initializer" Apr 23 18:28:23.226970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226848 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="storage-initializer" Apr 23 18:28:23.226970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226858 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kube-rbac-proxy" Apr 23 18:28:23.226970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226864 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kube-rbac-proxy" Apr 23 18:28:23.226970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226910 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kube-rbac-proxy" Apr 23 18:28:23.226970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.226919 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e45d1c3c-bb34-4044-b416-b9f6123a4c97" containerName="kserve-container" Apr 23 18:28:23.229841 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.229822 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.231511 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.231490 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 23 18:28:23.231792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.231776 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 23 18:28:23.241632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.241613 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp"] Apr 23 18:28:23.338892 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.338861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3915026e-f1f0-49b5-9504-ee7444f2e48c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.339057 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.338896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86ng\" (UniqueName: \"kubernetes.io/projected/3915026e-f1f0-49b5-9504-ee7444f2e48c-kube-api-access-k86ng\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.339057 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.338919 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3915026e-f1f0-49b5-9504-ee7444f2e48c-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: 
\"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.339057 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.339015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3915026e-f1f0-49b5-9504-ee7444f2e48c-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.440266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.440190 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3915026e-f1f0-49b5-9504-ee7444f2e48c-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.440266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.440234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3915026e-f1f0-49b5-9504-ee7444f2e48c-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.440486 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.440281 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3915026e-f1f0-49b5-9504-ee7444f2e48c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.440486 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.440418 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k86ng\" (UniqueName: \"kubernetes.io/projected/3915026e-f1f0-49b5-9504-ee7444f2e48c-kube-api-access-k86ng\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.440770 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.440741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3915026e-f1f0-49b5-9504-ee7444f2e48c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.440900 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.440881 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3915026e-f1f0-49b5-9504-ee7444f2e48c-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.442616 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.442596 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3915026e-f1f0-49b5-9504-ee7444f2e48c-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.449837 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.449814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86ng\" (UniqueName: \"kubernetes.io/projected/3915026e-f1f0-49b5-9504-ee7444f2e48c-kube-api-access-k86ng\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.539300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.539275 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:23.658353 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.658321 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp"] Apr 23 18:28:23.662074 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:28:23.662047 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3915026e_f1f0_49b5_9504_ee7444f2e48c.slice/crio-6d85e472f3a0287e51fb1dcd7610961856d44dab6d6b78695140268640245415 WatchSource:0}: Error finding container 6d85e472f3a0287e51fb1dcd7610961856d44dab6d6b78695140268640245415: Status 404 returned error can't find the container with id 6d85e472f3a0287e51fb1dcd7610961856d44dab6d6b78695140268640245415 Apr 23 18:28:23.663779 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.663757 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:28:23.766692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.766659 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerStarted","Data":"fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1"} Apr 23 18:28:23.766835 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.766702 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerStarted","Data":"6d85e472f3a0287e51fb1dcd7610961856d44dab6d6b78695140268640245415"} Apr 23 18:28:23.768586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.768563 2568 generic.go:358] "Generic (PLEG): container finished" podID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerID="1d3163d324104fda9a0ce24475e2186a89f6558881c778197212caca28495a7c" exitCode=2 Apr 23 18:28:23.768695 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:23.768605 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerDied","Data":"1d3163d324104fda9a0ce24475e2186a89f6558881c778197212caca28495a7c"} Apr 23 
18:28:24.542219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:24.542180 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 23 18:28:27.596303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.596266 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:28:27.782709 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.782679 2568 generic.go:358] "Generic (PLEG): container finished" podID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerID="a5e3ee3a5340b5da198b5a58bd430ab826cdb63d68fa15a2191278f3bedc865e" exitCode=0 Apr 23 18:28:27.782876 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.782753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerDied","Data":"a5e3ee3a5340b5da198b5a58bd430ab826cdb63d68fa15a2191278f3bedc865e"} Apr 23 18:28:27.783972 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.783953 2568 generic.go:358] "Generic (PLEG): container finished" podID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerID="fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1" exitCode=0 Apr 23 18:28:27.784070 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.784021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerDied","Data":"fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1"} Apr 23 18:28:27.850612 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.850589 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:28:27.975804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.975700 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kserve-provision-location\") pod \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " Apr 23 18:28:27.975804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.975766 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-proxy-tls\") pod \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " Apr 23 18:28:27.975804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.975803 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " Apr 23 18:28:27.976145 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.975846 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rln6d\" (UniqueName: \"kubernetes.io/projected/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kube-api-access-rln6d\") pod \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\" (UID: \"9110aa9a-a256-44c3-a2f3-08f4bb955ab0\") " Apr 23 18:28:27.976145 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.976076 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9110aa9a-a256-44c3-a2f3-08f4bb955ab0" (UID: "9110aa9a-a256-44c3-a2f3-08f4bb955ab0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:28:27.976271 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.976212 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "9110aa9a-a256-44c3-a2f3-08f4bb955ab0" (UID: "9110aa9a-a256-44c3-a2f3-08f4bb955ab0"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:28:27.978196 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.978159 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9110aa9a-a256-44c3-a2f3-08f4bb955ab0" (UID: "9110aa9a-a256-44c3-a2f3-08f4bb955ab0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:28:27.978339 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:27.978280 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kube-api-access-rln6d" (OuterVolumeSpecName: "kube-api-access-rln6d") pod "9110aa9a-a256-44c3-a2f3-08f4bb955ab0" (UID: "9110aa9a-a256-44c3-a2f3-08f4bb955ab0"). InnerVolumeSpecName "kube-api-access-rln6d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:28:28.077004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.076959 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rln6d\" (UniqueName: \"kubernetes.io/projected/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kube-api-access-rln6d\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:28:28.077004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.077002 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:28:28.077429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.077017 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:28:28.077429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.077032 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9110aa9a-a256-44c3-a2f3-08f4bb955ab0-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:28:28.788514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.788409 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" event={"ID":"9110aa9a-a256-44c3-a2f3-08f4bb955ab0","Type":"ContainerDied","Data":"ba358316fa3fbd5edcf9817dab14260dbd6bbceb21de8d353fa39c05dace95b8"} Apr 23 18:28:28.788514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.788444 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl" Apr 23 18:28:28.788514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.788470 2568 scope.go:117] "RemoveContainer" containerID="1d3163d324104fda9a0ce24475e2186a89f6558881c778197212caca28495a7c" Apr 23 18:28:28.790463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.790433 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerStarted","Data":"f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f"} Apr 23 18:28:28.790574 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.790476 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerStarted","Data":"3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de"} Apr 23 18:28:28.790978 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.790957 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:28.791078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.790987 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:28.792407 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.792378 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:28:28.796541 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.796524 2568 scope.go:117] "RemoveContainer" containerID="a5e3ee3a5340b5da198b5a58bd430ab826cdb63d68fa15a2191278f3bedc865e" Apr 23 18:28:28.802950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.802919 2568 scope.go:117] "RemoveContainer" containerID="c99cfbfd4f7b2c15f31aa972822c203128c05a17fea9d7a74193ede1442507f4" Apr 23 18:28:28.807530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.807510 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"] Apr 23 18:28:28.811444 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.811422 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fkntl"] Apr 23 18:28:28.831026 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:28.830980 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podStartSLOduration=5.830964651 podStartE2EDuration="5.830964651s" podCreationTimestamp="2026-04-23 18:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:28:28.829720684 +0000 UTC m=+2166.749813158" watchObservedRunningTime="2026-04-23 18:28:28.830964651 +0000 UTC m=+2166.751057126" Apr 23 18:28:29.794639 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:29.794600 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" 
podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:28:30.598947 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:30.598901 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" path="/var/lib/kubelet/pods/9110aa9a-a256-44c3-a2f3-08f4bb955ab0/volumes" Apr 23 18:28:34.799923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:34.799894 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:28:34.800504 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:34.800476 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:28:44.800602 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:44.800560 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:28:54.800651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:28:54.800607 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:29:04.800840 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:04.800799 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:29:14.800775 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:14.800730 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:29:24.800783 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:24.800728 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:29:34.801417 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:34.801331 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:29:44.801231 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:44.801198 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:29:53.397974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.397919 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp"] Apr 23 18:29:53.398448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.398379 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" containerID="cri-o://3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de" gracePeriod=30 Apr 23 18:29:53.398762 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.398700 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kube-rbac-proxy" containerID="cri-o://f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f" gracePeriod=30 Apr 23 18:29:53.500274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500235 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9"] Apr 23 18:29:53.500498 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500487 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" Apr 23 18:29:53.500543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500501 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" Apr 23 18:29:53.500543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500514 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="storage-initializer" Apr 23 18:29:53.500543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500520 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="storage-initializer" Apr 23 18:29:53.500543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500534 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kube-rbac-proxy" Apr 23 18:29:53.500543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500540 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kube-rbac-proxy" Apr 23 18:29:53.500701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500590 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kube-rbac-proxy" Apr 23 18:29:53.500701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.500599 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9110aa9a-a256-44c3-a2f3-08f4bb955ab0" containerName="kserve-container" Apr 23 18:29:53.503656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.503634 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.505596 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.505575 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 23 18:29:53.505724 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.505617 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:29:53.513852 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.513828 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9"] Apr 23 18:29:53.617231 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.617201 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927tm\" (UniqueName: \"kubernetes.io/projected/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kube-api-access-927tm\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.617399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.617254 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.617399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.617277 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.617399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.617329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.718347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.718263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.718347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.718300 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.718347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.718339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.718587 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.718368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-927tm\" (UniqueName: \"kubernetes.io/projected/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kube-api-access-927tm\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.718774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.718752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.718986 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.718969 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.720839 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.720819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.727408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.727387 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-927tm\" (UniqueName: \"kubernetes.io/projected/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kube-api-access-927tm\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.814442 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.814406 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:53.935296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:53.935266 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9"] Apr 23 18:29:53.937591 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:29:53.937562 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5986cc1_52f7_4aec_a1fd_a217d5bcbcdd.slice/crio-93b24c4b86529b7897b5cf6838ae811708f1136705c89166a2cecbd23b15fd56 WatchSource:0}: Error finding container 93b24c4b86529b7897b5cf6838ae811708f1136705c89166a2cecbd23b15fd56: Status 404 returned error can't find the container with id 93b24c4b86529b7897b5cf6838ae811708f1136705c89166a2cecbd23b15fd56 Apr 23 18:29:54.023773 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:54.023741 2568 generic.go:358] "Generic (PLEG): container finished" podID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerID="f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f" exitCode=2 Apr 23 18:29:54.024003 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:54.023807 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerDied","Data":"f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f"} Apr 23 18:29:54.025256 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:54.025233 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerStarted","Data":"7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3"} Apr 23 18:29:54.025358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:54.025262 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerStarted","Data":"93b24c4b86529b7897b5cf6838ae811708f1136705c89166a2cecbd23b15fd56"} Apr 23 18:29:54.795143 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:54.795099 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 23 18:29:54.800440 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:54.800414 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:29:58.039744 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.039655 2568 generic.go:358] "Generic (PLEG): container finished" podID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerID="7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3" exitCode=0 Apr 23 18:29:58.040136 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.039733 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" 
event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerDied","Data":"7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3"} Apr 23 18:29:58.533417 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.533396 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:29:58.654886 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.654850 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86ng\" (UniqueName: \"kubernetes.io/projected/3915026e-f1f0-49b5-9504-ee7444f2e48c-kube-api-access-k86ng\") pod \"3915026e-f1f0-49b5-9504-ee7444f2e48c\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " Apr 23 18:29:58.655080 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.654913 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3915026e-f1f0-49b5-9504-ee7444f2e48c-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"3915026e-f1f0-49b5-9504-ee7444f2e48c\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " Apr 23 18:29:58.655080 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.654974 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3915026e-f1f0-49b5-9504-ee7444f2e48c-kserve-provision-location\") pod \"3915026e-f1f0-49b5-9504-ee7444f2e48c\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " Apr 23 18:29:58.655080 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.655003 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3915026e-f1f0-49b5-9504-ee7444f2e48c-proxy-tls\") pod \"3915026e-f1f0-49b5-9504-ee7444f2e48c\" (UID: \"3915026e-f1f0-49b5-9504-ee7444f2e48c\") " Apr 23 18:29:58.655301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.655275 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3915026e-f1f0-49b5-9504-ee7444f2e48c-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "3915026e-f1f0-49b5-9504-ee7444f2e48c" (UID: "3915026e-f1f0-49b5-9504-ee7444f2e48c"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:29:58.655301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.655287 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3915026e-f1f0-49b5-9504-ee7444f2e48c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3915026e-f1f0-49b5-9504-ee7444f2e48c" (UID: "3915026e-f1f0-49b5-9504-ee7444f2e48c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:29:58.657029 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.657009 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3915026e-f1f0-49b5-9504-ee7444f2e48c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3915026e-f1f0-49b5-9504-ee7444f2e48c" (UID: "3915026e-f1f0-49b5-9504-ee7444f2e48c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:29:58.657153 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.657132 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3915026e-f1f0-49b5-9504-ee7444f2e48c-kube-api-access-k86ng" (OuterVolumeSpecName: "kube-api-access-k86ng") pod "3915026e-f1f0-49b5-9504-ee7444f2e48c" (UID: "3915026e-f1f0-49b5-9504-ee7444f2e48c"). InnerVolumeSpecName "kube-api-access-k86ng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:29:58.756341 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.756303 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3915026e-f1f0-49b5-9504-ee7444f2e48c-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:29:58.756341 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.756333 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3915026e-f1f0-49b5-9504-ee7444f2e48c-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:29:58.756341 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.756345 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3915026e-f1f0-49b5-9504-ee7444f2e48c-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:29:58.756561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:58.756355 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k86ng\" (UniqueName: \"kubernetes.io/projected/3915026e-f1f0-49b5-9504-ee7444f2e48c-kube-api-access-k86ng\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:29:59.045340 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.045252 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerStarted","Data":"c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595"} Apr 23 18:29:59.045340 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.045299 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerStarted","Data":"b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f"} Apr 23 18:29:59.045829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.045541 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:59.045829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.045569 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:29:59.046921 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.046897 2568 generic.go:358] "Generic (PLEG): container finished" podID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerID="3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de" exitCode=0 Apr 23 18:29:59.047049 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.046944 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerDied","Data":"3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de"} Apr 23 18:29:59.047049 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.046973 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" event={"ID":"3915026e-f1f0-49b5-9504-ee7444f2e48c","Type":"ContainerDied","Data":"6d85e472f3a0287e51fb1dcd7610961856d44dab6d6b78695140268640245415"} Apr 23 18:29:59.047049 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.046988 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp" Apr 23 18:29:59.047163 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.046992 2568 scope.go:117] "RemoveContainer" containerID="f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f" Apr 23 18:29:59.054924 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.054870 2568 scope.go:117] "RemoveContainer" containerID="3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de" Apr 23 18:29:59.061789 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.061771 2568 scope.go:117] "RemoveContainer" containerID="fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1" Apr 23 18:29:59.067659 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.067611 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podStartSLOduration=6.067594581 podStartE2EDuration="6.067594581s" podCreationTimestamp="2026-04-23 18:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:29:59.065895947 +0000 UTC m=+2256.985988425" watchObservedRunningTime="2026-04-23 18:29:59.067594581 +0000 UTC m=+2256.987687056" Apr 23 18:29:59.069103 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.069076 2568 scope.go:117] "RemoveContainer" containerID="f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f" Apr 23 18:29:59.069390 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:29:59.069365 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f\": container with ID starting with f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f not found: ID does not exist" containerID="f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f" Apr 23 18:29:59.069459 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.069400 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f"} err="failed to get container status \"f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f\": rpc error: code = NotFound desc = could not find container \"f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f\": container with ID starting with f2531f35ed15a904495d9666bee0f003c24a420669890488979e8c1899eb8b4f not found: ID does not exist" Apr 23 18:29:59.069459 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.069423 2568 scope.go:117] "RemoveContainer" containerID="3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de" Apr 23 
18:29:59.069675 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:29:59.069654 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de\": container with ID starting with 3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de not found: ID does not exist" containerID="3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de" Apr 23 18:29:59.069730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.069687 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de"} err="failed to get container status \"3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de\": rpc error: code = NotFound desc = could not find container \"3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de\": container with ID starting with 3475a68c7c50ff3c6195d19a7da15e39fbe19b6329049b7883f51352b46754de not found: ID does not exist" Apr 23 18:29:59.069730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.069712 2568 scope.go:117] "RemoveContainer" containerID="fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1" Apr 23 18:29:59.069973 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:29:59.069950 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1\": container with ID starting with fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1 not found: ID does not exist" containerID="fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1" Apr 23 18:29:59.070052 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.069981 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1"} err="failed to get container status \"fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1\": rpc error: code = NotFound desc = could not find container \"fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1\": container with ID starting with fb21f5dce894dabb9829037705134489380c9f082458a00d8864cae39f9163b1 not found: ID does not exist" Apr 23 18:29:59.082980 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.082955 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp"] Apr 23 18:29:59.087748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:29:59.087726 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-brdjp"] Apr 23 18:30:00.598535 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:30:00.598499 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" path="/var/lib/kubelet/pods/3915026e-f1f0-49b5-9504-ee7444f2e48c/volumes" Apr 23 18:30:05.055503 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:30:05.055474 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:30:35.056983 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:30:35.056915 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" 
podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:30:45.056896 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:30:45.056857 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:30:55.056363 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:30:55.056319 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:31:05.056442 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:05.056355 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:31:15.059253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:15.059216 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:31:23.651753 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.651718 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9"] Apr 23 18:31:23.652280 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.652105 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" containerID="cri-o://b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f" gracePeriod=30 Apr 23 18:31:23.652280 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.652129 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kube-rbac-proxy" containerID="cri-o://c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595" gracePeriod=30 Apr 23 18:31:23.757547 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757514 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z"] Apr 23 18:31:23.757837 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757823 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="storage-initializer" Apr 23 18:31:23.757901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757839 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" 
containerName="storage-initializer" Apr 23 18:31:23.757901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757848 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" Apr 23 18:31:23.757901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757856 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" Apr 23 18:31:23.757901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757868 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kube-rbac-proxy" Apr 23 18:31:23.757901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757874 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kube-rbac-proxy" Apr 23 18:31:23.758075 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757923 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kube-rbac-proxy" Apr 23 18:31:23.758075 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.757948 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3915026e-f1f0-49b5-9504-ee7444f2e48c" containerName="kserve-container" Apr 23 18:31:23.760940 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.760910 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.762922 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.762855 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:31:23.763158 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.763137 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 23 18:31:23.773890 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.773858 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z"] Apr 23 18:31:23.793058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.793030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hjv\" (UniqueName: \"kubernetes.io/projected/753035b8-f8c4-4399-9008-8b195c79833e-kube-api-access-48hjv\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.793058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.793072 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/753035b8-f8c4-4399-9008-8b195c79833e-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.793291 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.793171 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/753035b8-f8c4-4399-9008-8b195c79833e-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.793291 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.793216 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/753035b8-f8c4-4399-9008-8b195c79833e-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.893642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.893601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/753035b8-f8c4-4399-9008-8b195c79833e-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.893840 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.893658 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/753035b8-f8c4-4399-9008-8b195c79833e-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.893840 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.893773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/753035b8-f8c4-4399-9008-8b195c79833e-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.893977 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.893862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48hjv\" (UniqueName: \"kubernetes.io/projected/753035b8-f8c4-4399-9008-8b195c79833e-kube-api-access-48hjv\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.894234 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.894209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/753035b8-f8c4-4399-9008-8b195c79833e-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.894403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.894382 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/753035b8-f8c4-4399-9008-8b195c79833e-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.896248 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.896227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/753035b8-f8c4-4399-9008-8b195c79833e-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:23.903629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:23.903557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48hjv\" (UniqueName: \"kubernetes.io/projected/753035b8-f8c4-4399-9008-8b195c79833e-kube-api-access-48hjv\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:24.072034 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:24.071989 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:24.208052 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:24.207974 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z"] Apr 23 18:31:24.211057 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:31:24.211027 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753035b8_f8c4_4399_9008_8b195c79833e.slice/crio-986aaf6a6c5acafbf674308553fc704b3639203510c6ad2ecbd7c65ede1c192b WatchSource:0}: Error finding container 986aaf6a6c5acafbf674308553fc704b3639203510c6ad2ecbd7c65ede1c192b: Status 404 returned error can't find the container with id 986aaf6a6c5acafbf674308553fc704b3639203510c6ad2ecbd7c65ede1c192b Apr 23 18:31:24.281118 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:24.281090 2568 generic.go:358] "Generic (PLEG): container finished" podID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerID="c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595" exitCode=2 Apr 23 18:31:24.281271 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:24.281157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerDied","Data":"c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595"} Apr 23 18:31:24.284887 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:24.284858 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerStarted","Data":"a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e"} Apr 23 18:31:24.285040 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:24.284894 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" 
event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerStarted","Data":"986aaf6a6c5acafbf674308553fc704b3639203510c6ad2ecbd7c65ede1c192b"} Apr 23 18:31:25.051455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:25.051409 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 23 18:31:25.056923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:25.056888 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:31:28.296511 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.296476 2568 generic.go:358] "Generic (PLEG): container finished" podID="753035b8-f8c4-4399-9008-8b195c79833e" containerID="a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e" exitCode=0 Apr 23 18:31:28.296879 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.296548 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerDied","Data":"a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e"} Apr 23 18:31:28.788058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.788024 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:31:28.833873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.833779 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927tm\" (UniqueName: \"kubernetes.io/projected/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kube-api-access-927tm\") pod \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " Apr 23 18:31:28.833873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.833818 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " Apr 23 18:31:28.833873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.833856 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-proxy-tls\") pod \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " Apr 23 18:31:28.834195 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.833904 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kserve-provision-location\") pod \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\" (UID: \"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd\") " Apr 23 18:31:28.834298 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.834267 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" (UID: "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:31:28.834416 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.834299 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" (UID: "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:31:28.836035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.836006 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kube-api-access-927tm" (OuterVolumeSpecName: "kube-api-access-927tm") pod "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" (UID: "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd"). InnerVolumeSpecName "kube-api-access-927tm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:31:28.836141 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.836043 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" (UID: "e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:31:28.934456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.934422 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:31:28.934456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.934450 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-927tm\" (UniqueName: \"kubernetes.io/projected/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-kube-api-access-927tm\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:31:28.934456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.934461 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:31:28.934684 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:28.934472 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:31:29.300706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.300671 2568 generic.go:358] "Generic (PLEG): container finished" podID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerID="b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f" exitCode=0 Apr 23 18:31:29.301192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.300736 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerDied","Data":"b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f"} Apr 23 18:31:29.301192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.300774 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" event={"ID":"e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd","Type":"ContainerDied","Data":"93b24c4b86529b7897b5cf6838ae811708f1136705c89166a2cecbd23b15fd56"} Apr 23 18:31:29.301192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.300772 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9" Apr 23 18:31:29.301192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.300836 2568 scope.go:117] "RemoveContainer" containerID="c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595" Apr 23 18:31:29.302698 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.302669 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerStarted","Data":"4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d"} Apr 23 18:31:29.302779 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.302699 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerStarted","Data":"ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8"} Apr 23 18:31:29.303035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.303014 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:29.303082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.303051 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:31:29.308872 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.308812 2568 scope.go:117] "RemoveContainer" containerID="b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f" Apr 23 18:31:29.316048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.316025 2568 scope.go:117] "RemoveContainer" containerID="7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3" Apr 23 18:31:29.323555 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.323534 2568 scope.go:117] "RemoveContainer" containerID="c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595" Apr 23 18:31:29.324338 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:31:29.324307 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595\": container with ID starting with c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595 not found: ID does not exist" containerID="c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595" Apr 23 18:31:29.324424 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.324351 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595"} err="failed to get container status \"c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595\": rpc error: code = NotFound desc = could not find container \"c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595\": container with ID starting with c690fa44a41fc6f29c66c95f619df8ae9c520089f87ef6a6a2b1d3d2902c1595 not found: ID does not exist" Apr 23 18:31:29.324424 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.324376 2568 scope.go:117] "RemoveContainer" containerID="b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f" Apr 23 18:31:29.324663 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:31:29.324645 2568 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f\": container with ID starting with b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f not found: ID does not exist" containerID="b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f" Apr 23 18:31:29.324718 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.324669 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f"} err="failed to get container status \"b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f\": rpc error: code = NotFound desc = could not find container \"b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f\": container with ID starting with b4d0ead1273c421dccb4c29f8ca04b77276760fe5f5f4bc4f9c690f7da6a928f not found: ID does not exist" Apr 23 18:31:29.324718 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.324687 2568 scope.go:117] "RemoveContainer" containerID="7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3" Apr 23 18:31:29.324820 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.324783 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podStartSLOduration=6.324770097 podStartE2EDuration="6.324770097s" podCreationTimestamp="2026-04-23 18:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:31:29.323490159 +0000 UTC m=+2347.243582635" watchObservedRunningTime="2026-04-23 18:31:29.324770097 +0000 UTC m=+2347.244862571" Apr 23 18:31:29.324942 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:31:29.324906 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3\": container with ID starting with 7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3 not found: ID does not exist" containerID="7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3" Apr 23 18:31:29.325046 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.325024 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3"} err="failed to get container status \"7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3\": rpc error: code = NotFound desc = could not find container \"7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3\": container with ID starting with 7b4d65ab59faa7acdb0597c04fb205d1c5787f1d766555194ddd4eb74bfac6b3 not found: ID does not exist" Apr 23 18:31:29.338510 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.338478 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9"] Apr 23 18:31:29.345223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:29.345201 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-tqtb9"] Apr 23 18:31:30.598295 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:30.598259 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" 
path="/var/lib/kubelet/pods/e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd/volumes" Apr 23 18:31:35.312794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:31:35.312762 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:32:05.314148 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:05.314109 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:32:15.313901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:15.313866 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:32:22.670921 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:22.670890 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:32:22.673975 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:22.673950 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:32:22.674872 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:22.674851 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:32:22.677628 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:22.677614 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:32:25.313363 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:25.313326 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:32:35.313582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:35.313498 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:32:45.317327 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:45.317295 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:32:53.846742 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.846708 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z"] Apr 23 18:32:53.847328 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.847157 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" containerID="cri-o://ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8" gracePeriod=30 Apr 23 18:32:53.847328 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.847224 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kube-rbac-proxy" containerID="cri-o://4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d" gracePeriod=30 Apr 23 18:32:53.948378 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948343 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj"] Apr 23 18:32:53.948620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948608 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" Apr 23 18:32:53.948665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948621 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" Apr 23 18:32:53.948665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948632 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="storage-initializer" Apr 23 18:32:53.948665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948638 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="storage-initializer" Apr 23 18:32:53.948665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948652 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kube-rbac-proxy" Apr 23 18:32:53.948665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948659 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kube-rbac-proxy" Apr 23 18:32:53.948823 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948697 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kube-rbac-proxy" Apr 23 18:32:53.948823 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.948704 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5986cc1-52f7-4aec-a1fd-a217d5bcbcdd" containerName="kserve-container" Apr 23 18:32:53.951719 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.951693 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:53.953648 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.953624 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:32:53.953648 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.953641 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 23 18:32:53.960898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:53.960875 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj"] Apr 23 18:32:54.055591 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.055551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/606736e1-be53-4d8d-a634-658e10dfe192-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.055780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.055605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qbh\" (UniqueName: \"kubernetes.io/projected/606736e1-be53-4d8d-a634-658e10dfe192-kube-api-access-c9qbh\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.055780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.055625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/606736e1-be53-4d8d-a634-658e10dfe192-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.055780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.055682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/606736e1-be53-4d8d-a634-658e10dfe192-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.156414 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.156335 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/606736e1-be53-4d8d-a634-658e10dfe192-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.156414 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.156373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/606736e1-be53-4d8d-a634-658e10dfe192-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.156634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.156416 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qbh\" (UniqueName: \"kubernetes.io/projected/606736e1-be53-4d8d-a634-658e10dfe192-kube-api-access-c9qbh\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.156634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.156437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/606736e1-be53-4d8d-a634-658e10dfe192-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.156894 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.156872 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/606736e1-be53-4d8d-a634-658e10dfe192-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.157168 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.157151 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/606736e1-be53-4d8d-a634-658e10dfe192-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.158755 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.158736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/606736e1-be53-4d8d-a634-658e10dfe192-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.165048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.165029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qbh\" (UniqueName: \"kubernetes.io/projected/606736e1-be53-4d8d-a634-658e10dfe192-kube-api-access-c9qbh\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.263011 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.262968 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:54.387266 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.387232 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj"] Apr 23 18:32:54.389986 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:32:54.389957 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod606736e1_be53_4d8d_a634_658e10dfe192.slice/crio-eed0c0d3fbe039b925444a3bcd3bbfc91928364ef5ca9bb6808d99be3c410015 WatchSource:0}: Error finding container eed0c0d3fbe039b925444a3bcd3bbfc91928364ef5ca9bb6808d99be3c410015: Status 404 returned error can't find the container with id eed0c0d3fbe039b925444a3bcd3bbfc91928364ef5ca9bb6808d99be3c410015 Apr 23 18:32:54.542270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.542231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerStarted","Data":"93b4fb536e55c3a11263db768436a82df4e0b72dbd30f2921e1a4598432b70fc"} Apr 23 18:32:54.542270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.542280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerStarted","Data":"eed0c0d3fbe039b925444a3bcd3bbfc91928364ef5ca9bb6808d99be3c410015"} Apr 23 18:32:54.544320 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.544298 2568 generic.go:358] "Generic (PLEG): container finished" podID="753035b8-f8c4-4399-9008-8b195c79833e" containerID="4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d" exitCode=2 Apr 23 18:32:54.544429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:54.544359 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerDied","Data":"4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d"} Apr 23 18:32:55.308750 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:55.308712 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 23 18:32:55.313274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:55.313250 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:32:58.557174 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.557087 2568 generic.go:358] "Generic (PLEG): container finished" podID="606736e1-be53-4d8d-a634-658e10dfe192" containerID="93b4fb536e55c3a11263db768436a82df4e0b72dbd30f2921e1a4598432b70fc" exitCode=0 Apr 23 18:32:58.557174 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.557158 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerDied","Data":"93b4fb536e55c3a11263db768436a82df4e0b72dbd30f2921e1a4598432b70fc"} Apr 23 18:32:58.789528 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.789503 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:32:58.891182 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.891147 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48hjv\" (UniqueName: \"kubernetes.io/projected/753035b8-f8c4-4399-9008-8b195c79833e-kube-api-access-48hjv\") pod \"753035b8-f8c4-4399-9008-8b195c79833e\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " Apr 23 18:32:58.891182 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.891182 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/753035b8-f8c4-4399-9008-8b195c79833e-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"753035b8-f8c4-4399-9008-8b195c79833e\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " Apr 23 18:32:58.891413 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.891211 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/753035b8-f8c4-4399-9008-8b195c79833e-kserve-provision-location\") pod \"753035b8-f8c4-4399-9008-8b195c79833e\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " Apr 23 18:32:58.891413 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.891253 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/753035b8-f8c4-4399-9008-8b195c79833e-proxy-tls\") pod \"753035b8-f8c4-4399-9008-8b195c79833e\" (UID: \"753035b8-f8c4-4399-9008-8b195c79833e\") " Apr 23 18:32:58.891534 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.891505 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753035b8-f8c4-4399-9008-8b195c79833e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "753035b8-f8c4-4399-9008-8b195c79833e" (UID: "753035b8-f8c4-4399-9008-8b195c79833e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:32:58.891657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.891634 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753035b8-f8c4-4399-9008-8b195c79833e-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "753035b8-f8c4-4399-9008-8b195c79833e" (UID: "753035b8-f8c4-4399-9008-8b195c79833e"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:32:58.893344 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.893312 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753035b8-f8c4-4399-9008-8b195c79833e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "753035b8-f8c4-4399-9008-8b195c79833e" (UID: "753035b8-f8c4-4399-9008-8b195c79833e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:32:58.893459 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.893363 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753035b8-f8c4-4399-9008-8b195c79833e-kube-api-access-48hjv" (OuterVolumeSpecName: "kube-api-access-48hjv") pod "753035b8-f8c4-4399-9008-8b195c79833e" (UID: "753035b8-f8c4-4399-9008-8b195c79833e"). InnerVolumeSpecName "kube-api-access-48hjv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:32:58.992246 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.992203 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-48hjv\" (UniqueName: \"kubernetes.io/projected/753035b8-f8c4-4399-9008-8b195c79833e-kube-api-access-48hjv\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:32:58.992246 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.992240 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/753035b8-f8c4-4399-9008-8b195c79833e-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:32:58.992246 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.992251 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/753035b8-f8c4-4399-9008-8b195c79833e-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:32:58.992480 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:58.992260 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/753035b8-f8c4-4399-9008-8b195c79833e-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:32:59.561998 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.561897 2568 generic.go:358] "Generic (PLEG): container finished" podID="753035b8-f8c4-4399-9008-8b195c79833e" containerID="ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8" exitCode=0 Apr 23 18:32:59.562437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.561985 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerDied","Data":"ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8"} Apr 23 18:32:59.562437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.562004 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" Apr 23 18:32:59.562437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.562026 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z" event={"ID":"753035b8-f8c4-4399-9008-8b195c79833e","Type":"ContainerDied","Data":"986aaf6a6c5acafbf674308553fc704b3639203510c6ad2ecbd7c65ede1c192b"} Apr 23 18:32:59.562437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.562046 2568 scope.go:117] "RemoveContainer" containerID="4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d" Apr 23 18:32:59.563973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.563952 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerStarted","Data":"92ba21ca92bd332984698c851e7eff159f4de9ff3d08956837aa7b7ba2a4d1e6"} Apr 23 18:32:59.564091 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.563986 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerStarted","Data":"1e0ad88ff9a8217b33b10154e1d29e0ada6486dabb9b19d99244617f73c83a9b"} Apr 23 18:32:59.564210 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.564190 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:59.564285 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.564223 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:32:59.570289 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.570147 2568 scope.go:117] "RemoveContainer" containerID="ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8" Apr 23 18:32:59.576920 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.576898 2568 scope.go:117] "RemoveContainer" containerID="a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e" Apr 23 18:32:59.583230 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.583214 2568 scope.go:117] "RemoveContainer" containerID="4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d" Apr 23 18:32:59.583474 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:32:59.583456 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d\": container with ID starting with 4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d not found: ID does not exist" containerID="4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d" Apr 23 18:32:59.583536 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.583482 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d"} err="failed to get container status \"4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d\": rpc error: code = NotFound desc = could not find container \"4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d\": container with ID starting with 4196667c1e0552dcc087a75fec052773ded602dc61e262f8e9724e7ba8daa09d not 
found: ID does not exist" Apr 23 18:32:59.583536 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.583501 2568 scope.go:117] "RemoveContainer" containerID="ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8" Apr 23 18:32:59.583720 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:32:59.583703 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8\": container with ID starting with ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8 not found: ID does not exist" containerID="ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8" Apr 23 18:32:59.583760 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.583726 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8"} err="failed to get container status \"ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8\": rpc error: code = NotFound desc = could not find container \"ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8\": container with ID starting with ba165bc40dae4e502e475f72599d0e918630ed8ce9bba9f4b984fef67321dab8 not found: ID does not exist" Apr 23 18:32:59.583760 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.583741 2568 scope.go:117] "RemoveContainer" containerID="a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e" Apr 23 18:32:59.583976 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:32:59.583958 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e\": container with ID starting with a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e not found: ID does not exist" containerID="a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e" Apr 23 18:32:59.584054 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.583979 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e"} err="failed to get container status \"a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e\": rpc error: code = NotFound desc = could not find container \"a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e\": container with ID starting with a353db8e92035611d83373a522ff293e8227cf5d59d587c3ced7ed5dcb36648e not found: ID does not exist" Apr 23 18:32:59.587574 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.587538 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podStartSLOduration=6.587527297 podStartE2EDuration="6.587527297s" podCreationTimestamp="2026-04-23 18:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:32:59.586012154 +0000 UTC m=+2437.506104626" watchObservedRunningTime="2026-04-23 18:32:59.587527297 +0000 UTC m=+2437.507619826" Apr 23 18:32:59.599587 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.599565 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z"] Apr 23 18:32:59.605581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:32:59.605562 2568 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-s5n7z"] Apr 23 18:33:00.598707 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:33:00.598673 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753035b8-f8c4-4399-9008-8b195c79833e" path="/var/lib/kubelet/pods/753035b8-f8c4-4399-9008-8b195c79833e/volumes" Apr 23 18:33:05.573731 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:33:05.573697 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:33:35.574304 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:33:35.574259 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:33:45.575292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:33:45.575245 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:33:55.574772 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:33:55.574731 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:34:05.575132 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:05.575040 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:34:15.578225 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:15.578193 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:34:24.055192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:24.055160 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj"] Apr 23 18:34:24.055770 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:24.055471 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container" containerID="cri-o://1e0ad88ff9a8217b33b10154e1d29e0ada6486dabb9b19d99244617f73c83a9b" gracePeriod=30 Apr 23 18:34:24.055770 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:24.055510 2568 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kube-rbac-proxy" containerID="cri-o://92ba21ca92bd332984698c851e7eff159f4de9ff3d08956837aa7b7ba2a4d1e6" gracePeriod=30 Apr 23 18:34:24.797818 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:24.797783 2568 generic.go:358] "Generic (PLEG): container finished" podID="606736e1-be53-4d8d-a634-658e10dfe192" containerID="92ba21ca92bd332984698c851e7eff159f4de9ff3d08956837aa7b7ba2a4d1e6" exitCode=2 Apr 23 18:34:24.798033 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:24.797863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerDied","Data":"92ba21ca92bd332984698c851e7eff159f4de9ff3d08956837aa7b7ba2a4d1e6"} Apr 23 18:34:25.568719 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:25.568679 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 23 18:34:25.574442 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:25.574406 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:34:26.239055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239023 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"] Apr 23 18:34:26.239457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239422 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kube-rbac-proxy" Apr 23 18:34:26.239457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239449 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kube-rbac-proxy" Apr 23 18:34:26.239611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239471 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" Apr 23 18:34:26.239611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239481 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" Apr 23 18:34:26.239611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239494 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="storage-initializer" Apr 23 18:34:26.239611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239504 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="storage-initializer" Apr 23 18:34:26.239611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239585 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="753035b8-f8c4-4399-9008-8b195c79833e" 
containerName="kube-rbac-proxy" Apr 23 18:34:26.239611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.239600 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="753035b8-f8c4-4399-9008-8b195c79833e" containerName="kserve-container" Apr 23 18:34:26.242749 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.242733 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.244645 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.244624 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 23 18:34:26.244763 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.244673 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 23 18:34:26.255973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.255953 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"] Apr 23 18:34:26.257689 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.257664 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqv6h\" (UniqueName: \"kubernetes.io/projected/42ff5a6b-5db2-4982-ad73-549c177b4074-kube-api-access-hqv6h\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.257808 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.257730 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ff5a6b-5db2-4982-ad73-549c177b4074-kserve-provision-location\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.257808 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.257762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42ff5a6b-5db2-4982-ad73-549c177b4074-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.257808 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.257788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42ff5a6b-5db2-4982-ad73-549c177b4074-proxy-tls\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.359137 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.359111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ff5a6b-5db2-4982-ad73-549c177b4074-kserve-provision-location\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.359316 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:34:26.359142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42ff5a6b-5db2-4982-ad73-549c177b4074-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.359316 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.359162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42ff5a6b-5db2-4982-ad73-549c177b4074-proxy-tls\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.359316 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.359215 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqv6h\" (UniqueName: \"kubernetes.io/projected/42ff5a6b-5db2-4982-ad73-549c177b4074-kube-api-access-hqv6h\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.359557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.359536 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ff5a6b-5db2-4982-ad73-549c177b4074-kserve-provision-location\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.359869 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.359849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42ff5a6b-5db2-4982-ad73-549c177b4074-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.361538 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.361518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42ff5a6b-5db2-4982-ad73-549c177b4074-proxy-tls\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.369290 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.369263 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqv6h\" (UniqueName: \"kubernetes.io/projected/42ff5a6b-5db2-4982-ad73-549c177b4074-kube-api-access-hqv6h\") pod \"isvc-sklearn-predictor-7d84d8b7f9-zhsm7\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.552884 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.552799 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:26.672217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.672180 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"] Apr 23 18:34:26.675104 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:34:26.675069 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ff5a6b_5db2_4982_ad73_549c177b4074.slice/crio-31a51f687ea6b5737231c8940002000218dff1620f5fa7d7e1d94452200f8252 WatchSource:0}: Error finding container 31a51f687ea6b5737231c8940002000218dff1620f5fa7d7e1d94452200f8252: Status 404 returned error can't find the container with id 31a51f687ea6b5737231c8940002000218dff1620f5fa7d7e1d94452200f8252 Apr 23 18:34:26.677124 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.677106 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:34:26.805116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.805038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerStarted","Data":"6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459"} Apr 23 18:34:26.805116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:26.805074 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerStarted","Data":"31a51f687ea6b5737231c8940002000218dff1620f5fa7d7e1d94452200f8252"} Apr 23 18:34:28.812334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:28.812248 2568 generic.go:358] "Generic (PLEG): container finished" podID="606736e1-be53-4d8d-a634-658e10dfe192" containerID="1e0ad88ff9a8217b33b10154e1d29e0ada6486dabb9b19d99244617f73c83a9b" exitCode=0 Apr 23 18:34:28.812334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:28.812299 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerDied","Data":"1e0ad88ff9a8217b33b10154e1d29e0ada6486dabb9b19d99244617f73c83a9b"} Apr 23 18:34:29.088809 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.088784 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:34:29.178582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.178545 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/606736e1-be53-4d8d-a634-658e10dfe192-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"606736e1-be53-4d8d-a634-658e10dfe192\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " Apr 23 18:34:29.178751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.178593 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/606736e1-be53-4d8d-a634-658e10dfe192-proxy-tls\") pod \"606736e1-be53-4d8d-a634-658e10dfe192\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " Apr 23 18:34:29.178751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.178716 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9qbh\" (UniqueName: \"kubernetes.io/projected/606736e1-be53-4d8d-a634-658e10dfe192-kube-api-access-c9qbh\") pod \"606736e1-be53-4d8d-a634-658e10dfe192\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " Apr 23 18:34:29.178866 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.178758 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/606736e1-be53-4d8d-a634-658e10dfe192-kserve-provision-location\") pod \"606736e1-be53-4d8d-a634-658e10dfe192\" (UID: \"606736e1-be53-4d8d-a634-658e10dfe192\") " Apr 23 18:34:29.179001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.178976 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606736e1-be53-4d8d-a634-658e10dfe192-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "606736e1-be53-4d8d-a634-658e10dfe192" (UID: "606736e1-be53-4d8d-a634-658e10dfe192"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:34:29.179119 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.179100 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606736e1-be53-4d8d-a634-658e10dfe192-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "606736e1-be53-4d8d-a634-658e10dfe192" (UID: "606736e1-be53-4d8d-a634-658e10dfe192"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:34:29.180645 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.180625 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606736e1-be53-4d8d-a634-658e10dfe192-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "606736e1-be53-4d8d-a634-658e10dfe192" (UID: "606736e1-be53-4d8d-a634-658e10dfe192"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:34:29.180780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.180761 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606736e1-be53-4d8d-a634-658e10dfe192-kube-api-access-c9qbh" (OuterVolumeSpecName: "kube-api-access-c9qbh") pod "606736e1-be53-4d8d-a634-658e10dfe192" (UID: "606736e1-be53-4d8d-a634-658e10dfe192"). InnerVolumeSpecName "kube-api-access-c9qbh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:34:29.279987 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.279956 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/606736e1-be53-4d8d-a634-658e10dfe192-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:34:29.279987 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.279983 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/606736e1-be53-4d8d-a634-658e10dfe192-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:34:29.279987 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.279994 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/606736e1-be53-4d8d-a634-658e10dfe192-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:34:29.280279 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.280004 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9qbh\" (UniqueName: \"kubernetes.io/projected/606736e1-be53-4d8d-a634-658e10dfe192-kube-api-access-c9qbh\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:34:29.816800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.816762 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" event={"ID":"606736e1-be53-4d8d-a634-658e10dfe192","Type":"ContainerDied","Data":"eed0c0d3fbe039b925444a3bcd3bbfc91928364ef5ca9bb6808d99be3c410015"} Apr 23 18:34:29.817323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.816804 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj" Apr 23 18:34:29.817323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.816810 2568 scope.go:117] "RemoveContainer" containerID="92ba21ca92bd332984698c851e7eff159f4de9ff3d08956837aa7b7ba2a4d1e6" Apr 23 18:34:29.824879 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.824850 2568 scope.go:117] "RemoveContainer" containerID="1e0ad88ff9a8217b33b10154e1d29e0ada6486dabb9b19d99244617f73c83a9b" Apr 23 18:34:29.835758 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.835734 2568 scope.go:117] "RemoveContainer" containerID="93b4fb536e55c3a11263db768436a82df4e0b72dbd30f2921e1a4598432b70fc" Apr 23 18:34:29.840336 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.840309 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj"] Apr 23 18:34:29.844443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:29.844421 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-p5scj"] Apr 23 18:34:30.599270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:30.599237 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606736e1-be53-4d8d-a634-658e10dfe192" path="/var/lib/kubelet/pods/606736e1-be53-4d8d-a634-658e10dfe192/volumes" Apr 23 18:34:30.821107 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:30.821073 2568 generic.go:358] "Generic (PLEG): container finished" podID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerID="6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459" exitCode=0 Apr 23 18:34:30.821540 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:30.821148 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerDied","Data":"6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459"} Apr 23 18:34:31.826767 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:31.826732 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerStarted","Data":"9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"} Apr 23 18:34:31.826767 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:31.826773 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerStarted","Data":"3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761"} Apr 23 18:34:31.827267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:31.827003 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:31.846568 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:31.846513 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podStartSLOduration=5.846495419 podStartE2EDuration="5.846495419s" podCreationTimestamp="2026-04-23 18:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:34:31.845793817 +0000 UTC m=+2529.765886292" watchObservedRunningTime="2026-04-23 18:34:31.846495419 +0000 UTC 
m=+2529.766587894" Apr 23 18:34:32.829403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:32.829362 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:32.830700 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:32.830668 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:34:33.832410 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:33.832373 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:34:38.836421 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:38.836390 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" Apr 23 18:34:38.837042 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:38.836994 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:34:48.837173 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:48.837132 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:34:58.837292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:34:58.837249 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:35:08.837378 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:08.837339 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:35:18.837022 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:18.836982 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:35:28.836945 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:28.836904 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:35:38.837945 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:38.837851 2568 kubelet.go:2658] "SyncLoop (probe)" 
Apr 23 18:35:46.373572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.373533 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"]
Apr 23 18:35:46.374007 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.373881 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" containerID="cri-o://3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761" gracePeriod=30
Apr 23 18:35:46.374074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.373963 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kube-rbac-proxy" containerID="cri-o://9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9" gracePeriod=30
Apr 23 18:35:46.456014 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.455976 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"]
Apr 23 18:35:46.456274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456236 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="storage-initializer"
Apr 23 18:35:46.456274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456250 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="storage-initializer"
Apr 23 18:35:46.456274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456266 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container"
Apr 23 18:35:46.456274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456271 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container"
Apr 23 18:35:46.456464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456286 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kube-rbac-proxy"
Apr 23 18:35:46.456464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456292 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kube-rbac-proxy"
Apr 23 18:35:46.456464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456332 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kserve-container"
Apr 23 18:35:46.456464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.456340 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="606736e1-be53-4d8d-a634-658e10dfe192" containerName="kube-rbac-proxy"
Apr 23 18:35:46.460445 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.460415 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.462226 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.462200 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\""
Apr 23 18:35:46.462362 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.462300 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 23 18:35:46.469426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.469401 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"]
Apr 23 18:35:46.544642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.544606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnlq\" (UniqueName: \"kubernetes.io/projected/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kube-api-access-kdnlq\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.544857 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.544659 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.544857 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.544695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.544857 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.544724 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.645485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.645403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.645485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.645449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.645485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.645481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnlq\" (UniqueName: \"kubernetes.io/projected/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kube-api-access-kdnlq\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.645749 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.645516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.646004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.645983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.646122 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.646104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.647982 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.647964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.654191 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.654169 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnlq\" (UniqueName: \"kubernetes.io/projected/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kube-api-access-kdnlq\") pod \"sklearn-v2-mlserver-predictor-65d8664766-zc9ww\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:46.770748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.770709 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
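The two "Killing container with a grace period" entries at 18:35:46.374 ask the runtime to stop the old predictor's containers, giving each up to gracePeriod=30 seconds to exit on its own. A rough Go sketch of that stop-then-force pattern for a local process (illustrative only; CRI-O does this through the CRI, not by exec'ing processes):

    // graceful_stop.go — signal a process to terminate, then force-kill
    // if it has not exited within the grace period.
    package main

    import (
    	"os/exec"
    	"syscall"
    	"time"
    )

    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
    	// Polite request, analogous to the runtime's stop call above.
    	_ = cmd.Process.Signal(syscall.SIGTERM)
    	done := make(chan error, 1)
    	go func() { done <- cmd.Wait() }()
    	select {
    	case err := <-done:
    		return err // exited within the grace period
    	case <-time.After(grace):
    		return cmd.Process.Kill() // escalate after the deadline
    	}
    }

    func main() {
    	cmd := exec.Command("sleep", "60")
    	_ = cmd.Start()
    	_ = stopWithGrace(cmd, 2*time.Second)
    }

The kube-rbac-proxy container's exitCode=2 a moment later in the log is consistent with a process that ended on a termination signal rather than a clean exit.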
Apr 23 18:35:46.893878 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:46.893847 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"]
Apr 23 18:35:46.896816 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:35:46.896753 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eccc3a1_fbd3_4ac4_a11c_47c0c24ee3ed.slice/crio-a16acac950c4be63d6d0e1d365903d10079bdf38de489c4aca6223a74d5053e0 WatchSource:0}: Error finding container a16acac950c4be63d6d0e1d365903d10079bdf38de489c4aca6223a74d5053e0: Status 404 returned error can't find the container with id a16acac950c4be63d6d0e1d365903d10079bdf38de489c4aca6223a74d5053e0
Apr 23 18:35:47.029594 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:47.029560 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerStarted","Data":"173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec"}
Apr 23 18:35:47.029594 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:47.029597 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerStarted","Data":"a16acac950c4be63d6d0e1d365903d10079bdf38de489c4aca6223a74d5053e0"}
Apr 23 18:35:47.031457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:47.031432 2568 generic.go:358] "Generic (PLEG): container finished" podID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerID="9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9" exitCode=2
Apr 23 18:35:47.031567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:47.031480 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerDied","Data":"9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"}
Apr 23 18:35:48.833450 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:48.833403 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused"
Apr 23 18:35:48.837698 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:48.837665 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Apr 23 18:35:50.721222 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.721198 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"
Apr 23 18:35:50.781960 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.781922 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqv6h\" (UniqueName: \"kubernetes.io/projected/42ff5a6b-5db2-4982-ad73-549c177b4074-kube-api-access-hqv6h\") pod \"42ff5a6b-5db2-4982-ad73-549c177b4074\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") "
Apr 23 18:35:50.782091 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.781976 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42ff5a6b-5db2-4982-ad73-549c177b4074-proxy-tls\") pod \"42ff5a6b-5db2-4982-ad73-549c177b4074\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") "
Apr 23 18:35:50.782091 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.782008 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ff5a6b-5db2-4982-ad73-549c177b4074-kserve-provision-location\") pod \"42ff5a6b-5db2-4982-ad73-549c177b4074\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") "
Apr 23 18:35:50.782091 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.782026 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42ff5a6b-5db2-4982-ad73-549c177b4074-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"42ff5a6b-5db2-4982-ad73-549c177b4074\" (UID: \"42ff5a6b-5db2-4982-ad73-549c177b4074\") "
Apr 23 18:35:50.782300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.782278 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ff5a6b-5db2-4982-ad73-549c177b4074-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42ff5a6b-5db2-4982-ad73-549c177b4074" (UID: "42ff5a6b-5db2-4982-ad73-549c177b4074"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:35:50.782415 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.782392 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ff5a6b-5db2-4982-ad73-549c177b4074-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "42ff5a6b-5db2-4982-ad73-549c177b4074" (UID: "42ff5a6b-5db2-4982-ad73-549c177b4074"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:35:50.784023 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.783994 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ff5a6b-5db2-4982-ad73-549c177b4074-kube-api-access-hqv6h" (OuterVolumeSpecName: "kube-api-access-hqv6h") pod "42ff5a6b-5db2-4982-ad73-549c177b4074" (UID: "42ff5a6b-5db2-4982-ad73-549c177b4074"). InnerVolumeSpecName "kube-api-access-hqv6h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:35:50.784285 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.784268 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ff5a6b-5db2-4982-ad73-549c177b4074-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42ff5a6b-5db2-4982-ad73-549c177b4074" (UID: "42ff5a6b-5db2-4982-ad73-549c177b4074"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:35:50.882727 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.882659 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqv6h\" (UniqueName: \"kubernetes.io/projected/42ff5a6b-5db2-4982-ad73-549c177b4074-kube-api-access-hqv6h\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:35:50.882727 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.882694 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42ff5a6b-5db2-4982-ad73-549c177b4074-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:35:50.882727 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.882704 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ff5a6b-5db2-4982-ad73-549c177b4074-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:35:50.882727 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:50.882714 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42ff5a6b-5db2-4982-ad73-549c177b4074-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:35:51.042733 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.042708 2568 generic.go:358] "Generic (PLEG): container finished" podID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerID="173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec" exitCode=0
Apr 23 18:35:51.042876 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.042777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerDied","Data":"173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec"}
Apr 23 18:35:51.044527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.044506 2568 generic.go:358] "Generic (PLEG): container finished" podID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerID="3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761" exitCode=0
Apr 23 18:35:51.044632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.044554 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerDied","Data":"3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761"}
Apr 23 18:35:51.044632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.044575 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"
Apr 23 18:35:51.044632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.044590 2568 scope.go:117] "RemoveContainer" containerID="9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"
Apr 23 18:35:51.044774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.044577 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7" event={"ID":"42ff5a6b-5db2-4982-ad73-549c177b4074","Type":"ContainerDied","Data":"31a51f687ea6b5737231c8940002000218dff1620f5fa7d7e1d94452200f8252"}
Apr 23 18:35:51.052339 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.052200 2568 scope.go:117] "RemoveContainer" containerID="3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761"
Apr 23 18:35:51.059097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.059080 2568 scope.go:117] "RemoveContainer" containerID="6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459"
Apr 23 18:35:51.067184 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.067160 2568 scope.go:117] "RemoveContainer" containerID="9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"
Apr 23 18:35:51.067467 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:35:51.067443 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9\": container with ID starting with 9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9 not found: ID does not exist" containerID="9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"
Apr 23 18:35:51.067543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.067471 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"} err="failed to get container status \"9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9\": rpc error: code = NotFound desc = could not find container \"9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9\": container with ID starting with 9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9 not found: ID does not exist"
Apr 23 18:35:51.067543 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.067489 2568 scope.go:117] "RemoveContainer" containerID="3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761"
Apr 23 18:35:51.067717 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:35:51.067699 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761\": container with ID starting with 3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761 not found: ID does not exist" containerID="3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761"
Apr 23 18:35:51.067768 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.067723 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761"} err="failed to get container status \"3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761\": rpc error: code = NotFound desc = could not find container \"3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761\": container with ID starting with 3f1fc6392e710f12b298f96ffe6547012aaa6e6b0f68e40599526b5f768f8761 not found: ID does not exist"
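Each "RemoveContainer" above succeeds, after which kubelet's follow-up ContainerStatus lookup comes back from CRI-O with gRPC code NotFound and is logged via log.go:32 and pod_container_deletor.go:53; cleanup still proceeds, since a missing container is exactly the end state a delete wants. A hedged sketch of that treat-NotFound-as-done check, using the google.golang.org/grpc/status package (an assumption for illustration, not kubelet's actual code):

    // notfound_idempotent.go — treat a NotFound status lookup after a
    // delete as "already gone" rather than as a real failure.
    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // alreadyGone reports whether an error from the runtime just means
    // the container no longer exists.
    func alreadyGone(err error) bool {
    	return status.Code(err) == codes.NotFound
    }

    func main() {
    	// Simulated runtime response shaped like the log entries above.
    	err := status.Error(codes.NotFound,
    		`could not find container "9b71f5b8d7cb9a08c1d40c5ca6063069446727f6722b61a5efe76f14851b20f9"`)
    	if alreadyGone(err) {
    		fmt.Println("container already removed; treating delete as complete")
    	}
    }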
Apr 23 18:35:51.067768 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.067740 2568 scope.go:117] "RemoveContainer" containerID="6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459"
Apr 23 18:35:51.067992 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:35:51.067972 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459\": container with ID starting with 6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459 not found: ID does not exist" containerID="6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459"
Apr 23 18:35:51.068085 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.067995 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459"} err="failed to get container status \"6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459\": rpc error: code = NotFound desc = could not find container \"6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459\": container with ID starting with 6b8fcc2bd6a3e953334e4685291a0acb91a743ebf78abe5abef5bf443b4e4459 not found: ID does not exist"
Apr 23 18:35:51.075649 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.075139 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"]
Apr 23 18:35:51.082907 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:51.082883 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7d84d8b7f9-zhsm7"]
Apr 23 18:35:52.048767 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:52.048729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerStarted","Data":"52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82"}
Apr 23 18:35:52.049230 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:52.048774 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerStarted","Data":"7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84"}
Apr 23 18:35:52.049230 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:52.049010 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:52.069359 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:52.069309 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" podStartSLOduration=6.06929422 podStartE2EDuration="6.06929422s" podCreationTimestamp="2026-04-23 18:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:35:52.067431891 +0000 UTC m=+2609.987524366" watchObservedRunningTime="2026-04-23 18:35:52.06929422 +0000 UTC m=+2609.989386695"
Apr 23 18:35:52.598899 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:52.598867 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" path="/var/lib/kubelet/pods/42ff5a6b-5db2-4982-ad73-549c177b4074/volumes"
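The pod_startup_latency_tracker entry above is plain timestamp arithmetic: podStartSLOduration=6.06929422 is watchObservedRunningTime (18:35:52.06929422) minus podCreationTimestamp (18:35:46), with the pull timestamps left at their zero value because no image pull was needed. The subtraction, reproduced in Go with the values from the log (Go's parser accepts the fractional seconds even though the layout omits them):

    // startup_latency.go — recompute the logged podStartSLOduration.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	layout := "2006-01-02 15:04:05 -0700 MST"
    	created, _ := time.Parse(layout, "2026-04-23 18:35:46 +0000 UTC")
    	observed, _ := time.Parse(layout, "2026-04-23 18:35:52.06929422 +0000 UTC")
    	// Prints 6.06929422s, matching podStartSLOduration above.
    	fmt.Println(observed.Sub(created))
    }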
Apr 23 18:35:53.052581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:53.052507 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:35:59.061556 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:35:59.061526 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:36:29.068748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:29.068710 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 23 18:36:39.064712 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:39.064678 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
Apr 23 18:36:46.558675 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.558645 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"]
Apr 23 18:36:46.559217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.559013 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kserve-container" containerID="cri-o://7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84" gracePeriod=30
Apr 23 18:36:46.559217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.559083 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kube-rbac-proxy" containerID="cri-o://52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82" gracePeriod=30
Apr 23 18:36:46.638076 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638035 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"]
Apr 23 18:36:46.638672 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638651 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="storage-initializer"
Apr 23 18:36:46.638766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638675 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="storage-initializer"
Apr 23 18:36:46.638766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638695 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kube-rbac-proxy"
Apr 23 18:36:46.638766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638704 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kube-rbac-proxy"
Apr 23 18:36:46.638766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638723 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container"
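The 18:36:29 entry above fails with "HTTP probe failed with statuscode: 400": for HTTP probes kubelet counts any status from 200 up to but not including 400 as success, so a 400 from the model server marks the pod not ready even though the endpoint answered. A minimal sketch of that rule (the URL is the healthz target that appears later in the log; skipping TLS verification is an illustrative shortcut, not kubelet behavior):

    // http_probe.go — apply kubelet's 2xx/3xx success rule to a GET.
    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"net/http"
    	"time"
    )

    func probe(url string) error {
    	client := &http.Client{
    		Timeout: time.Second,
    		// Self-signed serving cert on the target; accept it for the sketch.
    		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
    	}
    	resp, err := client.Get(url)
    	if err != nil {
    		return err // e.g. the "connection refused" probe outputs in this log
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
    		return nil // healthy
    	}
    	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
    	fmt.Println(probe("https://10.133.0.42:8643/healthz"))
    }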
Apr 23 18:36:46.638766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638735 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container"
Apr 23 18:36:46.639033 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638850 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kube-rbac-proxy"
Apr 23 18:36:46.639033 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.638862 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="42ff5a6b-5db2-4982-ad73-549c177b4074" containerName="kserve-container"
Apr 23 18:36:46.644677 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.644654 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.646584 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.646560 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\""
Apr 23 18:36:46.646584 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.646572 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:36:46.650683 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.650658 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"]
Apr 23 18:36:46.724092 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.724056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f206e4-b537-4506-a58d-b804f5e01ba5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.724267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.724109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f206e4-b537-4506-a58d-b804f5e01ba5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.724267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.724136 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f206e4-b537-4506-a58d-b804f5e01ba5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.724267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.724166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqfqw\" (UniqueName: \"kubernetes.io/projected/56f206e4-b537-4506-a58d-b804f5e01ba5-kube-api-access-wqfqw\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.824973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.824923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f206e4-b537-4506-a58d-b804f5e01ba5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.825147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.824986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f206e4-b537-4506-a58d-b804f5e01ba5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.825147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.825021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqfqw\" (UniqueName: \"kubernetes.io/projected/56f206e4-b537-4506-a58d-b804f5e01ba5-kube-api-access-wqfqw\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.825147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.825074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f206e4-b537-4506-a58d-b804f5e01ba5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.825469 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.825448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f206e4-b537-4506-a58d-b804f5e01ba5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.825737 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.825708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f206e4-b537-4506-a58d-b804f5e01ba5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.827293 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.827275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f206e4-b537-4506-a58d-b804f5e01ba5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.833382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.833361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqfqw\" (UniqueName: \"kubernetes.io/projected/56f206e4-b537-4506-a58d-b804f5e01ba5-kube-api-access-wqfqw\") pod \"isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:46.956535 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:46.956496 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:47.079723 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:47.079644 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"]
Apr 23 18:36:47.082682 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:36:47.082659 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f206e4_b537_4506_a58d_b804f5e01ba5.slice/crio-2dde632f91ee0ebd50270d077accd465009a62c92d87cd1b70789809ccc89f61 WatchSource:0}: Error finding container 2dde632f91ee0ebd50270d077accd465009a62c92d87cd1b70789809ccc89f61: Status 404 returned error can't find the container with id 2dde632f91ee0ebd50270d077accd465009a62c92d87cd1b70789809ccc89f61
Apr 23 18:36:47.205664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:47.205633 2568 generic.go:358] "Generic (PLEG): container finished" podID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerID="52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82" exitCode=2
Apr 23 18:36:47.205852 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:47.205713 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerDied","Data":"52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82"}
Apr 23 18:36:47.207080 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:47.207054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerStarted","Data":"3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8"}
Apr 23 18:36:47.207189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:47.207086 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerStarted","Data":"2dde632f91ee0ebd50270d077accd465009a62c92d87cd1b70789809ccc89f61"}
Apr 23 18:36:49.057563 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:49.057523 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused"
Apr 23 18:36:53.225757 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:53.225711 2568 generic.go:358] "Generic (PLEG): container finished" podID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerID="3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8" exitCode=0
Apr 23 18:36:53.226136 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:53.225788 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerDied","Data":"3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8"}
Apr 23 18:36:54.056995 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.056915 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused"
Apr 23 18:36:54.230643 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.230608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerStarted","Data":"0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d"}
Apr 23 18:36:54.230643 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.230647 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerStarted","Data":"6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042"}
Apr 23 18:36:54.231092 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.231004 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:54.231152 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.231135 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:36:54.232317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.232292 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 18:36:54.248989 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:54.248906 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" podStartSLOduration=8.248873139 podStartE2EDuration="8.248873139s" podCreationTimestamp="2026-04-23 18:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:36:54.247691452 +0000 UTC m=+2672.167783927" watchObservedRunningTime="2026-04-23 18:36:54.248873139 +0000 UTC m=+2672.168965615"
Apr 23 18:36:55.234018 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:55.233965 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 18:36:57.403586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.403563 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
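The m=+2672.167783927 suffix in the 18:36:54 startup-latency entry above is Go's monotonic clock reading, which time.Time.String() appends for times obtained from time.Now(); the offset counts seconds from the process's monotonic baseline (roughly kubelet start, about 44.5 minutes before this entry, which matches). A small demonstration:

    // monotonic.go — show and strip the "m=" monotonic clock suffix.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	t := time.Now()
    	fmt.Println(t)          // wall time plus a monotonic reading, e.g. "... m=+0.000012345"
    	fmt.Println(t.Round(0)) // Round(0) strips the monotonic reading, so no "m=" suffix
    }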
Apr 23 18:36:57.508223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.508134 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-proxy-tls\") pod \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") "
Apr 23 18:36:57.508223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.508195 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kserve-provision-location\") pod \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") "
Apr 23 18:36:57.508223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.508223 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdnlq\" (UniqueName: \"kubernetes.io/projected/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kube-api-access-kdnlq\") pod \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") "
Apr 23 18:36:57.508526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.508247 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\" (UID: \"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed\") "
Apr 23 18:36:57.508526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.508507 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" (UID: "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:36:57.508730 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.508691 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" (UID: "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:36:57.510287 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.510267 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" (UID: "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:36:57.510370 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.510355 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kube-api-access-kdnlq" (OuterVolumeSpecName: "kube-api-access-kdnlq") pod "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" (UID: "5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed"). InnerVolumeSpecName "kube-api-access-kdnlq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:36:57.608956 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.608912 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:36:57.608956 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.608953 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:36:57.608956 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.608964 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdnlq\" (UniqueName: \"kubernetes.io/projected/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-kube-api-access-kdnlq\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:36:57.609180 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:57.608974 2568 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:36:58.244032 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.243995 2568 generic.go:358] "Generic (PLEG): container finished" podID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerID="7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84" exitCode=0
Apr 23 18:36:58.244269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.244079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerDied","Data":"7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84"}
Apr 23 18:36:58.244269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.244106 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"
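The UnmountVolume → TearDown → "Volume detached" progression above is kubelet's volume reconciler converging the actual state of the world (what is still mounted) toward the desired state (nothing, once the pod is deleted). A toy sketch of that desired-versus-actual loop (the generic pattern only, not kubelet's reconciler):

    // reconcile.go — tear down anything mounted but no longer desired.
    package main

    import "fmt"

    func reconcile(desired, actual map[string]bool, tearDown func(string)) {
    	for vol := range actual {
    		if !desired[vol] {
    			tearDown(vol) // e.g. UnmountVolume.TearDown, then mark detached
    			delete(actual, vol)
    		}
    	}
    }

    func main() {
    	desired := map[string]bool{} // pod deleted: nothing is desired anymore
    	actual := map[string]bool{"proxy-tls": true, "kserve-provision-location": true}
    	reconcile(desired, actual, func(v string) { fmt.Println("detached", v) })
    }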
Apr 23 18:36:58.244269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.244126 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww" event={"ID":"5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed","Type":"ContainerDied","Data":"a16acac950c4be63d6d0e1d365903d10079bdf38de489c4aca6223a74d5053e0"}
Apr 23 18:36:58.244269 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.244149 2568 scope.go:117] "RemoveContainer" containerID="52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82"
Apr 23 18:36:58.254199 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.252793 2568 scope.go:117] "RemoveContainer" containerID="7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84"
Apr 23 18:36:58.261809 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.261789 2568 scope.go:117] "RemoveContainer" containerID="173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec"
Apr 23 18:36:58.268824 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.268805 2568 scope.go:117] "RemoveContainer" containerID="52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82"
Apr 23 18:36:58.269108 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:36:58.269088 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82\": container with ID starting with 52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82 not found: ID does not exist" containerID="52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82"
Apr 23 18:36:58.269169 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.269116 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82"} err="failed to get container status \"52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82\": rpc error: code = NotFound desc = could not find container \"52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82\": container with ID starting with 52e02211c3fa4274772b480ce4823cc84bad05249f573b372e5bfb448dad3b82 not found: ID does not exist"
Apr 23 18:36:58.269169 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.269133 2568 scope.go:117] "RemoveContainer" containerID="7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84"
Apr 23 18:36:58.269373 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:36:58.269353 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84\": container with ID starting with 7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84 not found: ID does not exist" containerID="7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84"
Apr 23 18:36:58.269436 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.269383 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84"} err="failed to get container status \"7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84\": rpc error: code = NotFound desc = could not find container \"7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84\": container with ID starting with 7bbfa1bc42b8c74d3b236bed780a4aee6ecc9a4e691f59e2a72bb41a4bf7ed84 not found: ID does not exist"
Apr 23 18:36:58.269436 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.269408 2568 scope.go:117] "RemoveContainer" containerID="173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec"
Apr 23 18:36:58.269613 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:36:58.269597 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec\": container with ID starting with 173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec not found: ID does not exist" containerID="173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec"
Apr 23 18:36:58.269666 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.269620 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec"} err="failed to get container status \"173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec\": rpc error: code = NotFound desc = could not find container \"173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec\": container with ID starting with 173cc262ab384fd73a1db3d05eafcdf8686b43ff5a5e17733cfd8db531dd00ec not found: ID does not exist"
Apr 23 18:36:58.273815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.273794 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"]
Apr 23 18:36:58.277260 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.277239 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-zc9ww"]
Apr 23 18:36:58.599428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:36:58.599391 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" path="/var/lib/kubelet/pods/5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed/volumes"
Apr 23 18:37:00.239033 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:00.238996 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:37:00.239636 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:00.239610 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 18:37:10.240990 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:10.240880 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"
Apr 23 18:37:22.689386 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:22.689359 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:37:22.692854 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:22.692828 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:37:22.693079 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:22.693058 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:37:22.696481 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:22.696462 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:37:23.698167 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.698139 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt_56f206e4-b537-4506-a58d-b804f5e01ba5/kserve-container/0.log"
Apr 23 18:37:23.835121 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.835087 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"]
Apr 23 18:37:23.835443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.835401 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container" containerID="cri-o://6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042" gracePeriod=30
Apr 23 18:37:23.835529 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.835456 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kube-rbac-proxy" containerID="cri-o://0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d" gracePeriod=30
Apr 23 18:37:23.934422 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934387 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"]
Apr 23 18:37:23.934702 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934689 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kserve-container"
Apr 23 18:37:23.934748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934706 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kserve-container"
Apr 23 18:37:23.934748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934721 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="storage-initializer"
Apr 23 18:37:23.934748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934727 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="storage-initializer"
Apr 23 18:37:23.934748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934736 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kube-rbac-proxy"
Apr 23 18:37:23.934748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934742 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kube-rbac-proxy"
Apr 23 18:37:23.934903 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934794 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kserve-container"
Apr 23 18:37:23.934903 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.934803 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5eccc3a1-fbd3-4ac4-a11c-47c0c24ee3ed" containerName="kube-rbac-proxy"
Apr 23 18:37:23.937736 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.937709 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:23.939963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.939912 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 23 18:37:23.940083 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.940005 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:37:23.949274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:23.949204 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"]
Apr 23 18:37:24.014692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.014662 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67f15f76-fe39-48b6-9835-7006dbcaff80-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.014692 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.014694 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vg66\" (UniqueName: \"kubernetes.io/projected/67f15f76-fe39-48b6-9835-7006dbcaff80-kube-api-access-9vg66\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.014898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.014768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67f15f76-fe39-48b6-9835-7006dbcaff80-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.014898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.014832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67f15f76-fe39-48b6-9835-7006dbcaff80-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.115742 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.115707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67f15f76-fe39-48b6-9835-7006dbcaff80-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.115742 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.115754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vg66\" (UniqueName: \"kubernetes.io/projected/67f15f76-fe39-48b6-9835-7006dbcaff80-kube-api-access-9vg66\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.116038 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.115805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67f15f76-fe39-48b6-9835-7006dbcaff80-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.116038 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.115867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67f15f76-fe39-48b6-9835-7006dbcaff80-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.116369 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.116345 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67f15f76-fe39-48b6-9835-7006dbcaff80-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.116646 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.116627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67f15f76-fe39-48b6-9835-7006dbcaff80-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.118722 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.118692 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67f15f76-fe39-48b6-9835-7006dbcaff80-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.124119 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.124097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vg66\" (UniqueName: \"kubernetes.io/projected/67f15f76-fe39-48b6-9835-7006dbcaff80-kube-api-access-9vg66\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:37:24.248527 ip-10-0-131-144 kubenswrapper[2568]: I0423
18:37:24.248438 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" Apr 23 18:37:24.321668 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.321631 2568 generic.go:358] "Generic (PLEG): container finished" podID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerID="0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d" exitCode=2 Apr 23 18:37:24.321849 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.321688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerDied","Data":"0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d"} Apr 23 18:37:24.369754 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.369721 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"] Apr 23 18:37:24.373150 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:37:24.373120 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f15f76_fe39_48b6_9835_7006dbcaff80.slice/crio-ffa7d8bc68279d87e4d21284741f92488f43dd08c376a5576af653e3e4250bb5 WatchSource:0}: Error finding container ffa7d8bc68279d87e4d21284741f92488f43dd08c376a5576af653e3e4250bb5: Status 404 returned error can't find the container with id ffa7d8bc68279d87e4d21284741f92488f43dd08c376a5576af653e3e4250bb5 Apr 23 18:37:24.968882 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:24.968859 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" Apr 23 18:37:25.124923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.124888 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f206e4-b537-4506-a58d-b804f5e01ba5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"56f206e4-b537-4506-a58d-b804f5e01ba5\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " Apr 23 18:37:25.125127 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.124972 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqfqw\" (UniqueName: \"kubernetes.io/projected/56f206e4-b537-4506-a58d-b804f5e01ba5-kube-api-access-wqfqw\") pod \"56f206e4-b537-4506-a58d-b804f5e01ba5\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " Apr 23 18:37:25.125127 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.125056 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f206e4-b537-4506-a58d-b804f5e01ba5-kserve-provision-location\") pod \"56f206e4-b537-4506-a58d-b804f5e01ba5\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " Apr 23 18:37:25.125127 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.125105 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f206e4-b537-4506-a58d-b804f5e01ba5-proxy-tls\") pod \"56f206e4-b537-4506-a58d-b804f5e01ba5\" (UID: \"56f206e4-b537-4506-a58d-b804f5e01ba5\") " Apr 23 18:37:25.125340 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.125309 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/56f206e4-b537-4506-a58d-b804f5e01ba5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "56f206e4-b537-4506-a58d-b804f5e01ba5" (UID: "56f206e4-b537-4506-a58d-b804f5e01ba5"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:37:25.127105 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.127071 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f206e4-b537-4506-a58d-b804f5e01ba5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "56f206e4-b537-4506-a58d-b804f5e01ba5" (UID: "56f206e4-b537-4506-a58d-b804f5e01ba5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:37:25.127214 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.127165 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f206e4-b537-4506-a58d-b804f5e01ba5-kube-api-access-wqfqw" (OuterVolumeSpecName: "kube-api-access-wqfqw") pod "56f206e4-b537-4506-a58d-b804f5e01ba5" (UID: "56f206e4-b537-4506-a58d-b804f5e01ba5"). InnerVolumeSpecName "kube-api-access-wqfqw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:37:25.144447 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.144404 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f206e4-b537-4506-a58d-b804f5e01ba5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "56f206e4-b537-4506-a58d-b804f5e01ba5" (UID: "56f206e4-b537-4506-a58d-b804f5e01ba5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:37:25.226620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.226584 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f206e4-b537-4506-a58d-b804f5e01ba5-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:37:25.226620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.226614 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f206e4-b537-4506-a58d-b804f5e01ba5-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:37:25.226620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.226625 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f206e4-b537-4506-a58d-b804f5e01ba5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:37:25.226861 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.226634 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqfqw\" (UniqueName: \"kubernetes.io/projected/56f206e4-b537-4506-a58d-b804f5e01ba5-kube-api-access-wqfqw\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:37:25.326014 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.325980 2568 generic.go:358] "Generic (PLEG): container finished" podID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerID="6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042" exitCode=0 Apr 23 18:37:25.326198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.326059 2568 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" Apr 23 18:37:25.326198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.326068 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerDied","Data":"6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042"} Apr 23 18:37:25.326198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.326104 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt" event={"ID":"56f206e4-b537-4506-a58d-b804f5e01ba5","Type":"ContainerDied","Data":"2dde632f91ee0ebd50270d077accd465009a62c92d87cd1b70789809ccc89f61"} Apr 23 18:37:25.326198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.326124 2568 scope.go:117] "RemoveContainer" containerID="0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d" Apr 23 18:37:25.327610 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.327585 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerStarted","Data":"1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf"} Apr 23 18:37:25.327610 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.327617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerStarted","Data":"ffa7d8bc68279d87e4d21284741f92488f43dd08c376a5576af653e3e4250bb5"} Apr 23 18:37:25.333784 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.333766 2568 scope.go:117] "RemoveContainer" containerID="6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042" Apr 23 18:37:25.340979 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.340960 2568 scope.go:117] "RemoveContainer" containerID="3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8" Apr 23 18:37:25.348052 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.348034 2568 scope.go:117] "RemoveContainer" containerID="0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d" Apr 23 18:37:25.348300 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:37:25.348282 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d\": container with ID starting with 0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d not found: ID does not exist" containerID="0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d" Apr 23 18:37:25.348347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.348313 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d"} err="failed to get container status \"0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d\": rpc error: code = NotFound desc = could not find container \"0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d\": container with ID starting with 0acf706b94a4d7174ba6f762bd43f64ca3907ec574a3225eb52f3aaf8b98e39d not found: ID does not exist" Apr 23 18:37:25.348347 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.348333 2568 scope.go:117] 
"RemoveContainer" containerID="6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042" Apr 23 18:37:25.348587 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:37:25.348570 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042\": container with ID starting with 6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042 not found: ID does not exist" containerID="6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042" Apr 23 18:37:25.348629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.348593 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042"} err="failed to get container status \"6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042\": rpc error: code = NotFound desc = could not find container \"6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042\": container with ID starting with 6e36dca01bfb271ad7afab59fa2c6668eae87ffd70a9a7b8e221a589fecf9042 not found: ID does not exist" Apr 23 18:37:25.348629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.348609 2568 scope.go:117] "RemoveContainer" containerID="3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8" Apr 23 18:37:25.348808 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:37:25.348793 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8\": container with ID starting with 3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8 not found: ID does not exist" containerID="3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8" Apr 23 18:37:25.348848 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.348811 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8"} err="failed to get container status \"3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8\": rpc error: code = NotFound desc = could not find container \"3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8\": container with ID starting with 3c4c3c2ceb03c58e2c59b7eb6227c3648442ddb4995547d435345a7d3ddb22c8 not found: ID does not exist" Apr 23 18:37:25.367387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.367349 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"] Apr 23 18:37:25.372178 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:25.372145 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6f864c46cc-fvnpt"] Apr 23 18:37:26.598581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:26.598546 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" path="/var/lib/kubelet/pods/56f206e4-b537-4506-a58d-b804f5e01ba5/volumes" Apr 23 18:37:28.337373 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:28.337347 2568 generic.go:358] "Generic (PLEG): container finished" podID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerID="1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf" exitCode=0 Apr 23 18:37:28.337761 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:28.337409 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerDied","Data":"1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf"} Apr 23 18:37:29.341845 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:29.341811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerStarted","Data":"f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c"} Apr 23 18:37:29.342296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:29.341853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerStarted","Data":"28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357"} Apr 23 18:37:29.342296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:29.342109 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" Apr 23 18:37:29.365306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:29.365264 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podStartSLOduration=6.365250037 podStartE2EDuration="6.365250037s" podCreationTimestamp="2026-04-23 18:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:37:29.364091594 +0000 UTC m=+2707.284184070" watchObservedRunningTime="2026-04-23 18:37:29.365250037 +0000 UTC m=+2707.285342513" Apr 23 18:37:30.345098 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:30.345068 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" Apr 23 18:37:36.353478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:37:36.353448 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" Apr 23 18:38:06.370066 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:06.370024 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 18:38:16.356773 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:16.356744 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" Apr 23 18:38:24.011072 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.011035 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"] Apr 23 18:38:24.011560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.011371 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kserve-container" containerID="cri-o://28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357" gracePeriod=30 
Apr 23 18:38:24.011560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.011410 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kube-rbac-proxy" containerID="cri-o://f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c" gracePeriod=30
Apr 23 18:38:24.104074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104038 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"]
Apr 23 18:38:24.104314 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104302 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="storage-initializer"
Apr 23 18:38:24.104358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104316 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="storage-initializer"
Apr 23 18:38:24.104358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104324 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kube-rbac-proxy"
Apr 23 18:38:24.104358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104329 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kube-rbac-proxy"
Apr 23 18:38:24.104358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104342 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container"
Apr 23 18:38:24.104358 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104347 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container"
Apr 23 18:38:24.104519 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104389 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kube-rbac-proxy"
Apr 23 18:38:24.104519 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.104398 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f206e4-b537-4506-a58d-b804f5e01ba5" containerName="kserve-container"
Apr 23 18:38:24.107363 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.107345 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.109137 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.109115 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\""
Apr 23 18:38:24.109239 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.109120 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 23 18:38:24.118580 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.118553 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"]
Apr 23 18:38:24.167186 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.167146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06c59f79-7714-459e-bdca-6143b41065b6-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.167368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.167206 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/06c59f79-7714-459e-bdca-6143b41065b6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.167368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.167251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sfdg\" (UniqueName: \"kubernetes.io/projected/06c59f79-7714-459e-bdca-6143b41065b6-kube-api-access-2sfdg\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.167368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.167271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c59f79-7714-459e-bdca-6143b41065b6-proxy-tls\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.267711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.267621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06c59f79-7714-459e-bdca-6143b41065b6-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.267711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.267691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/06c59f79-7714-459e-bdca-6143b41065b6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.267964 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.267731 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sfdg\" (UniqueName: \"kubernetes.io/projected/06c59f79-7714-459e-bdca-6143b41065b6-kube-api-access-2sfdg\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.267964 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.267761 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c59f79-7714-459e-bdca-6143b41065b6-proxy-tls\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.268138 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.268112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06c59f79-7714-459e-bdca-6143b41065b6-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.268391 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.268364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/06c59f79-7714-459e-bdca-6143b41065b6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.270276 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.270250 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c59f79-7714-459e-bdca-6143b41065b6-proxy-tls\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.276596 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.276573 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sfdg\" (UniqueName: \"kubernetes.io/projected/06c59f79-7714-459e-bdca-6143b41065b6-kube-api-access-2sfdg\") pod \"isvc-sklearn-v2-predictor-557dddfbc-rd5tp\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.416843 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.416799 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:24.499717 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.499675 2568 generic.go:358] "Generic (PLEG): container finished" podID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerID="f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c" exitCode=2
Apr 23 18:38:24.499894 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.499765 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerDied","Data":"f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c"}
Apr 23 18:38:24.544406 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:24.544328 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"]
Apr 23 18:38:24.548026 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:38:24.547994 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06c59f79_7714_459e_bdca_6143b41065b6.slice/crio-26164e085f42d87d809e955e9eb42f19a7768f83f3a5d31cc847976342ec8a63 WatchSource:0}: Error finding container 26164e085f42d87d809e955e9eb42f19a7768f83f3a5d31cc847976342ec8a63: Status 404 returned error can't find the container with id 26164e085f42d87d809e955e9eb42f19a7768f83f3a5d31cc847976342ec8a63
Apr 23 18:38:25.509603 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:25.509566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerStarted","Data":"41203b5729c456fd5cb5db0f54aabd0751388ad92ddb7f74b134f612dfd8b931"}
Apr 23 18:38:25.509603 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:25.509604 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerStarted","Data":"26164e085f42d87d809e955e9eb42f19a7768f83f3a5d31cc847976342ec8a63"}
Apr 23 18:38:26.348074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:26.348016 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused"
Apr 23 18:38:27.354538 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:27.354490 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.44:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 23 18:38:28.520613 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:28.520576 2568 generic.go:358] "Generic (PLEG): container finished" podID="06c59f79-7714-459e-bdca-6143b41065b6" containerID="41203b5729c456fd5cb5db0f54aabd0751388ad92ddb7f74b134f612dfd8b931" exitCode=0
Apr 23 18:38:28.521022 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:28.520623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerDied","Data":"41203b5729c456fd5cb5db0f54aabd0751388ad92ddb7f74b134f612dfd8b931"}
Apr 23 18:38:29.525189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:29.525151 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerStarted","Data":"dced5a54808012ce5b37dafda1d2f7d3325b90f44a2bfa2000e0be0758253efc"}
Apr 23 18:38:29.525189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:29.525197 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerStarted","Data":"978d3979a4803ce2a3dd36deca1de7763c8748f6923e3d4867a8daccd37197ec"}
Apr 23 18:38:29.525619 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:29.525381 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:29.545403 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:29.545351 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podStartSLOduration=5.545337043 podStartE2EDuration="5.545337043s" podCreationTimestamp="2026-04-23 18:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:38:29.544169802 +0000 UTC m=+2767.464262279" watchObservedRunningTime="2026-04-23 18:38:29.545337043 +0000 UTC m=+2767.465429517"
Apr 23 18:38:30.528634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:30.528595 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:30.529680 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:30.529653 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:38:31.348310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.348264 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused"
Apr 23 18:38:31.531323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.531280 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:38:31.762757 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.762736 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:38:31.829104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.829067 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vg66\" (UniqueName: \"kubernetes.io/projected/67f15f76-fe39-48b6-9835-7006dbcaff80-kube-api-access-9vg66\") pod \"67f15f76-fe39-48b6-9835-7006dbcaff80\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") "
Apr 23 18:38:31.829291 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.829116 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67f15f76-fe39-48b6-9835-7006dbcaff80-proxy-tls\") pod \"67f15f76-fe39-48b6-9835-7006dbcaff80\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") "
Apr 23 18:38:31.829291 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.829236 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67f15f76-fe39-48b6-9835-7006dbcaff80-kserve-provision-location\") pod \"67f15f76-fe39-48b6-9835-7006dbcaff80\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") "
Apr 23 18:38:31.829389 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.829303 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67f15f76-fe39-48b6-9835-7006dbcaff80-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"67f15f76-fe39-48b6-9835-7006dbcaff80\" (UID: \"67f15f76-fe39-48b6-9835-7006dbcaff80\") "
Apr 23 18:38:31.829598 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.829567 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f15f76-fe39-48b6-9835-7006dbcaff80-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "67f15f76-fe39-48b6-9835-7006dbcaff80" (UID: "67f15f76-fe39-48b6-9835-7006dbcaff80"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:38:31.829721 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.829654 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f15f76-fe39-48b6-9835-7006dbcaff80-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "67f15f76-fe39-48b6-9835-7006dbcaff80" (UID: "67f15f76-fe39-48b6-9835-7006dbcaff80"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:38:31.831261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.831237 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f15f76-fe39-48b6-9835-7006dbcaff80-kube-api-access-9vg66" (OuterVolumeSpecName: "kube-api-access-9vg66") pod "67f15f76-fe39-48b6-9835-7006dbcaff80" (UID: "67f15f76-fe39-48b6-9835-7006dbcaff80"). InnerVolumeSpecName "kube-api-access-9vg66". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:38:31.831355 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.831283 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f15f76-fe39-48b6-9835-7006dbcaff80-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "67f15f76-fe39-48b6-9835-7006dbcaff80" (UID: "67f15f76-fe39-48b6-9835-7006dbcaff80"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:38:31.930557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.930466 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67f15f76-fe39-48b6-9835-7006dbcaff80-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:38:31.930557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.930495 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67f15f76-fe39-48b6-9835-7006dbcaff80-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:38:31.930557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.930509 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67f15f76-fe39-48b6-9835-7006dbcaff80-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:38:31.930557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:31.930523 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vg66\" (UniqueName: \"kubernetes.io/projected/67f15f76-fe39-48b6-9835-7006dbcaff80-kube-api-access-9vg66\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:38:32.534744 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.534712 2568 generic.go:358] "Generic (PLEG): container finished" podID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerID="28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357" exitCode=0
Apr 23 18:38:32.535176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.534775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerDied","Data":"28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357"}
Apr 23 18:38:32.535176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.534787 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"
Apr 23 18:38:32.535176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.534809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq" event={"ID":"67f15f76-fe39-48b6-9835-7006dbcaff80","Type":"ContainerDied","Data":"ffa7d8bc68279d87e4d21284741f92488f43dd08c376a5576af653e3e4250bb5"}
Apr 23 18:38:32.535176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.534829 2568 scope.go:117] "RemoveContainer" containerID="f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c"
Apr 23 18:38:32.542820 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.542802 2568 scope.go:117] "RemoveContainer" containerID="28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357"
Apr 23 18:38:32.549855 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.549839 2568 scope.go:117] "RemoveContainer" containerID="1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf"
Apr 23 18:38:32.555577 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.555554 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"]
Apr 23 18:38:32.556600 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.556584 2568 scope.go:117] "RemoveContainer" containerID="f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c"
Apr 23 18:38:32.556840 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:38:32.556819 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c\": container with ID starting with f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c not found: ID does not exist" containerID="f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c"
Apr 23 18:38:32.556901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.556848 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c"} err="failed to get container status \"f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c\": rpc error: code = NotFound desc = could not find container \"f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c\": container with ID starting with f7754c8fe085bb9d55681eaccbd2a535047d9cde2fd7de3b670e62a6fb547c2c not found: ID does not exist"
Apr 23 18:38:32.556901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.556865 2568 scope.go:117] "RemoveContainer" containerID="28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357"
Apr 23 18:38:32.557126 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:38:32.557109 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357\": container with ID starting with 28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357 not found: ID does not exist" containerID="28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357"
Apr 23 18:38:32.557197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.557132 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357"} err="failed to get container status \"28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357\": rpc error: code = NotFound desc = could not find container \"28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357\": container with ID starting with 28a6c4964e33bf6fb04080ea47708d8907b1bde147521698d84a78c221cd2357 not found: ID does not exist"
Apr 23 18:38:32.557197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.557149 2568 scope.go:117] "RemoveContainer" containerID="1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf"
Apr 23 18:38:32.557404 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:38:32.557387 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf\": container with ID starting with 1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf not found: ID does not exist" containerID="1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf"
Apr 23 18:38:32.557464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.557408 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf"} err="failed to get container status \"1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf\": rpc error: code = NotFound desc = could not find container \"1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf\": container with ID starting with 1d5f9639dc3f98e57c207ac5f357ea69e948c919f4aecb49122b1bd1e50d9fbf not found: ID does not exist"
Apr 23 18:38:32.561418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.561395 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-gxbcq"]
Apr 23 18:38:32.598656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:32.598633 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" path="/var/lib/kubelet/pods/67f15f76-fe39-48b6-9835-7006dbcaff80/volumes"
Apr 23 18:38:36.535705 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:36.535675 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:38:36.536132 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:36.536081 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:38:46.536246 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:46.536203 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:38:56.537082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:38:56.537041 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:39:06.536128 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:06.536085 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:39:16.537070 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:16.537028 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:39:26.536704 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:26.536666 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:39:36.537079 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:36.537048 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"
Apr 23 18:39:44.333854 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.333817 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"]
Apr 23 18:39:44.334453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.334169 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" containerID="cri-o://978d3979a4803ce2a3dd36deca1de7763c8748f6923e3d4867a8daccd37197ec" gracePeriod=30
Apr 23 18:39:44.334453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.334193 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kube-rbac-proxy" containerID="cri-o://dced5a54808012ce5b37dafda1d2f7d3325b90f44a2bfa2000e0be0758253efc" gracePeriod=30
Apr 23 18:39:44.431026 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.430992 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"]
Apr 23 18:39:44.431262 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431250 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kube-rbac-proxy"
Apr 23 18:39:44.431311 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431263 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kube-rbac-proxy"
Apr 23 18:39:44.431311 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431274 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="storage-initializer"
Apr 23 18:39:44.431311 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431280 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="storage-initializer"
Apr 23 18:39:44.431311 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431310 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kserve-container"
Apr 23 18:39:44.431443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431316 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kserve-container"
Apr 23 18:39:44.431443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431361 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kube-rbac-proxy"
Apr 23 18:39:44.431443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.431373 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="67f15f76-fe39-48b6-9835-7006dbcaff80" containerName="kserve-container"
Apr 23 18:39:44.435418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.435398 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.437240 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.437218 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\""
Apr 23 18:39:44.437240 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.437236 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\""
Apr 23 18:39:44.444165 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.444127 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"]
Apr 23 18:39:44.459515 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.459495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh97t\" (UniqueName: \"kubernetes.io/projected/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kube-api-access-wh97t\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.459611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.459531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c37cd673-df2b-4869-8cf1-cb5fe7af9906-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.459611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.459559 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.459702 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.459642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.560435 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.560396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.560624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.560456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh97t\" (UniqueName: \"kubernetes.io/projected/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kube-api-access-wh97t\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.560624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.560495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c37cd673-df2b-4869-8cf1-cb5fe7af9906-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.560624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.560529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"
Apr 23 18:39:44.560791 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:39:44.560669 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found
Apr 23 18:39:44.560791 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:39:44.560729 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls podName:c37cd673-df2b-4869-8cf1-cb5fe7af9906 nodeName:}" failed. No retries permitted until 2026-04-23 18:39:45.060708061 +0000 UTC m=+2842.980800517 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" (UID: "c37cd673-df2b-4869-8cf1-cb5fe7af9906") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 23 18:39:44.560913 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.560837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:44.561146 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.561125 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c37cd673-df2b-4869-8cf1-cb5fe7af9906-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:44.569409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.569380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh97t\" (UniqueName: \"kubernetes.io/projected/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kube-api-access-wh97t\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:44.738149 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.738053 2568 generic.go:358] "Generic (PLEG): container finished" podID="06c59f79-7714-459e-bdca-6143b41065b6" containerID="dced5a54808012ce5b37dafda1d2f7d3325b90f44a2bfa2000e0be0758253efc" exitCode=2 Apr 23 18:39:44.738149 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:44.738108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerDied","Data":"dced5a54808012ce5b37dafda1d2f7d3325b90f44a2bfa2000e0be0758253efc"} Apr 23 18:39:45.063870 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.063765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:45.066239 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.066214 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:45.346303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.346267 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:45.485669 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.485627 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"] Apr 23 18:39:45.490218 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:39:45.490185 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc37cd673_df2b_4869_8cf1_cb5fe7af9906.slice/crio-e1e9b4451f3573cd0388edd440a30fe17b7f407a48e55874e14e8935ae821144 WatchSource:0}: Error finding container e1e9b4451f3573cd0388edd440a30fe17b7f407a48e55874e14e8935ae821144: Status 404 returned error can't find the container with id e1e9b4451f3573cd0388edd440a30fe17b7f407a48e55874e14e8935ae821144 Apr 23 18:39:45.492139 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.492120 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:39:45.742800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.742718 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerStarted","Data":"22ce176907e6d1910eb69018e30ae8f2b732042880c1468846500a3d7bd4d1d9"} Apr 23 18:39:45.742800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:45.742754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerStarted","Data":"e1e9b4451f3573cd0388edd440a30fe17b7f407a48e55874e14e8935ae821144"} Apr 23 18:39:46.532143 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:46.532101 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.45:8643/healthz\": dial tcp 10.133.0.45:8643: connect: connection refused" Apr 23 18:39:46.536533 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:46.536502 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:39:48.753517 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.753489 2568 generic.go:358] "Generic (PLEG): container finished" podID="06c59f79-7714-459e-bdca-6143b41065b6" containerID="978d3979a4803ce2a3dd36deca1de7763c8748f6923e3d4867a8daccd37197ec" exitCode=0 Apr 23 18:39:48.753847 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.753537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerDied","Data":"978d3979a4803ce2a3dd36deca1de7763c8748f6923e3d4867a8daccd37197ec"} Apr 23 18:39:48.876194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.876170 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" Apr 23 18:39:48.887768 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.887742 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06c59f79-7714-459e-bdca-6143b41065b6-kserve-provision-location\") pod \"06c59f79-7714-459e-bdca-6143b41065b6\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " Apr 23 18:39:48.887916 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.887781 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sfdg\" (UniqueName: \"kubernetes.io/projected/06c59f79-7714-459e-bdca-6143b41065b6-kube-api-access-2sfdg\") pod \"06c59f79-7714-459e-bdca-6143b41065b6\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " Apr 23 18:39:48.887916 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.887812 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c59f79-7714-459e-bdca-6143b41065b6-proxy-tls\") pod \"06c59f79-7714-459e-bdca-6143b41065b6\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " Apr 23 18:39:48.887916 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.887844 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/06c59f79-7714-459e-bdca-6143b41065b6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"06c59f79-7714-459e-bdca-6143b41065b6\" (UID: \"06c59f79-7714-459e-bdca-6143b41065b6\") " Apr 23 18:39:48.888191 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.888164 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06c59f79-7714-459e-bdca-6143b41065b6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "06c59f79-7714-459e-bdca-6143b41065b6" (UID: "06c59f79-7714-459e-bdca-6143b41065b6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:39:48.888285 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.888262 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06c59f79-7714-459e-bdca-6143b41065b6-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "06c59f79-7714-459e-bdca-6143b41065b6" (UID: "06c59f79-7714-459e-bdca-6143b41065b6"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:39:48.889975 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.889949 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c59f79-7714-459e-bdca-6143b41065b6-kube-api-access-2sfdg" (OuterVolumeSpecName: "kube-api-access-2sfdg") pod "06c59f79-7714-459e-bdca-6143b41065b6" (UID: "06c59f79-7714-459e-bdca-6143b41065b6"). InnerVolumeSpecName "kube-api-access-2sfdg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:39:48.890103 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.890087 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c59f79-7714-459e-bdca-6143b41065b6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "06c59f79-7714-459e-bdca-6143b41065b6" (UID: "06c59f79-7714-459e-bdca-6143b41065b6"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:39:48.988460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.988364 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/06c59f79-7714-459e-bdca-6143b41065b6-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:39:48.988460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.988398 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/06c59f79-7714-459e-bdca-6143b41065b6-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:39:48.988460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.988416 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sfdg\" (UniqueName: \"kubernetes.io/projected/06c59f79-7714-459e-bdca-6143b41065b6-kube-api-access-2sfdg\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:39:48.988460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:48.988428 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c59f79-7714-459e-bdca-6143b41065b6-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:39:49.757528 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.757493 2568 generic.go:358] "Generic (PLEG): container finished" podID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerID="22ce176907e6d1910eb69018e30ae8f2b732042880c1468846500a3d7bd4d1d9" exitCode=0 Apr 23 18:39:49.758050 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.757563 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerDied","Data":"22ce176907e6d1910eb69018e30ae8f2b732042880c1468846500a3d7bd4d1d9"} Apr 23 18:39:49.759352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.759323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" event={"ID":"06c59f79-7714-459e-bdca-6143b41065b6","Type":"ContainerDied","Data":"26164e085f42d87d809e955e9eb42f19a7768f83f3a5d31cc847976342ec8a63"} Apr 23 18:39:49.759448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.759368 2568 scope.go:117] "RemoveContainer" containerID="dced5a54808012ce5b37dafda1d2f7d3325b90f44a2bfa2000e0be0758253efc" Apr 23 18:39:49.759448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.759342 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp" Apr 23 18:39:49.767728 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.767374 2568 scope.go:117] "RemoveContainer" containerID="978d3979a4803ce2a3dd36deca1de7763c8748f6923e3d4867a8daccd37197ec" Apr 23 18:39:49.774784 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.774761 2568 scope.go:117] "RemoveContainer" containerID="41203b5729c456fd5cb5db0f54aabd0751388ad92ddb7f74b134f612dfd8b931" Apr 23 18:39:49.789606 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.789577 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"] Apr 23 18:39:49.793176 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:49.793147 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-557dddfbc-rd5tp"] Apr 23 18:39:50.599145 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:50.599113 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c59f79-7714-459e-bdca-6143b41065b6" path="/var/lib/kubelet/pods/06c59f79-7714-459e-bdca-6143b41065b6/volumes" Apr 23 18:39:50.764949 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:50.764888 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerStarted","Data":"0148638c9529385806c40560bbc7e1c275f5c12977b202d9a2b50d9e5f84756e"} Apr 23 18:39:50.764949 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:50.764925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerStarted","Data":"5cf4ad4052272d5eb80a43747aabfec8396fc6f8411f50d5fcccc747b2c46624"} Apr 23 18:39:50.765491 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:50.765185 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:50.783669 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:50.783622 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podStartSLOduration=6.783608568 podStartE2EDuration="6.783608568s" podCreationTimestamp="2026-04-23 18:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:39:50.782159931 +0000 UTC m=+2848.702252406" watchObservedRunningTime="2026-04-23 18:39:50.783608568 +0000 UTC m=+2848.703701043" Apr 23 18:39:51.767794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:51.767755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:51.769022 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:51.768992 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:39:52.770478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:52.770433 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:39:57.774946 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:57.774908 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:39:57.775553 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:39:57.775529 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:40:07.775532 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:40:07.775437 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:40:17.776425 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:40:17.776379 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:40:27.776112 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:40:27.776062 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:40:37.776364 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:40:37.776324 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:40:47.775708 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:40:47.775660 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:40:57.776103 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:40:57.776073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:41:04.549885 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.549846 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"] Apr 23 18:41:04.550361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.550175 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" 
containerID="cri-o://5cf4ad4052272d5eb80a43747aabfec8396fc6f8411f50d5fcccc747b2c46624" gracePeriod=30 Apr 23 18:41:04.550361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.550264 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kube-rbac-proxy" containerID="cri-o://0148638c9529385806c40560bbc7e1c275f5c12977b202d9a2b50d9e5f84756e" gracePeriod=30 Apr 23 18:41:04.637527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637492 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r"] Apr 23 18:41:04.637794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637781 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="storage-initializer" Apr 23 18:41:04.637850 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637795 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="storage-initializer" Apr 23 18:41:04.637850 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637806 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" Apr 23 18:41:04.637850 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637811 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" Apr 23 18:41:04.637850 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637822 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kube-rbac-proxy" Apr 23 18:41:04.637850 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637828 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kube-rbac-proxy" Apr 23 18:41:04.638074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637881 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kube-rbac-proxy" Apr 23 18:41:04.638074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.637891 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="06c59f79-7714-459e-bdca-6143b41065b6" containerName="kserve-container" Apr 23 18:41:04.640697 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.640676 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.642375 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.642356 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 23 18:41:04.642477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.642359 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 23 18:41:04.649594 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.649571 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r"] Apr 23 18:41:04.754288 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.754248 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.754478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.754320 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.754478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.754374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpwt\" (UniqueName: \"kubernetes.io/projected/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kube-api-access-ndpwt\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.754478 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.754406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.855599 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.855562 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.855791 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.855606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpwt\" (UniqueName: \"kubernetes.io/projected/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kube-api-access-ndpwt\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: 
\"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.855791 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.855626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.855791 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.855656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.856187 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.856166 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.856394 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.856375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.858145 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.858126 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.864700 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.864676 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpwt\" (UniqueName: \"kubernetes.io/projected/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kube-api-access-ndpwt\") pod \"isvc-tensorflow-predictor-6756f669d7-xzg8r\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.951546 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.951508 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:04.974883 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.974846 2568 generic.go:358] "Generic (PLEG): container finished" podID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerID="0148638c9529385806c40560bbc7e1c275f5c12977b202d9a2b50d9e5f84756e" exitCode=2 Apr 23 18:41:04.975054 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:04.974917 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerDied","Data":"0148638c9529385806c40560bbc7e1c275f5c12977b202d9a2b50d9e5f84756e"} Apr 23 18:41:05.075649 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:05.075622 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r"] Apr 23 18:41:05.076578 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:41:05.076551 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a42b44c_1120_45fe_88d3_7a6edc4e52c4.slice/crio-292a0ce9f1259402dbe9e4b5ed863595c6e6c3dfc43da97fbd6f4a50dca63c1b WatchSource:0}: Error finding container 292a0ce9f1259402dbe9e4b5ed863595c6e6c3dfc43da97fbd6f4a50dca63c1b: Status 404 returned error can't find the container with id 292a0ce9f1259402dbe9e4b5ed863595c6e6c3dfc43da97fbd6f4a50dca63c1b Apr 23 18:41:05.979015 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:05.978976 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerStarted","Data":"796fb49549e347e46d732c5f435d3e513c2892358e105c8c09b78a8c1dce9e38"} Apr 23 18:41:05.979015 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:05.979016 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerStarted","Data":"292a0ce9f1259402dbe9e4b5ed863595c6e6c3dfc43da97fbd6f4a50dca63c1b"} Apr 23 18:41:07.771635 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:07.771594 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 23 18:41:07.775481 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:07.775449 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:41:08.990150 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:08.990120 2568 generic.go:358] "Generic (PLEG): container finished" podID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerID="5cf4ad4052272d5eb80a43747aabfec8396fc6f8411f50d5fcccc747b2c46624" exitCode=0 Apr 23 18:41:08.990499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:08.990202 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" 
event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerDied","Data":"5cf4ad4052272d5eb80a43747aabfec8396fc6f8411f50d5fcccc747b2c46624"} Apr 23 18:41:08.990499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:08.990247 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" event={"ID":"c37cd673-df2b-4869-8cf1-cb5fe7af9906","Type":"ContainerDied","Data":"e1e9b4451f3573cd0388edd440a30fe17b7f407a48e55874e14e8935ae821144"} Apr 23 18:41:08.990499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:08.990262 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1e9b4451f3573cd0388edd440a30fe17b7f407a48e55874e14e8935ae821144" Apr 23 18:41:08.997464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:08.997437 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:41:09.185096 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.185060 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls\") pod \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " Apr 23 18:41:09.185303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.185143 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kserve-provision-location\") pod \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " Apr 23 18:41:09.185303 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.185169 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh97t\" (UniqueName: \"kubernetes.io/projected/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kube-api-access-wh97t\") pod \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " Apr 23 18:41:09.185429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.185334 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c37cd673-df2b-4869-8cf1-cb5fe7af9906-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\" (UID: \"c37cd673-df2b-4869-8cf1-cb5fe7af9906\") " Apr 23 18:41:09.185550 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.185516 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c37cd673-df2b-4869-8cf1-cb5fe7af9906" (UID: "c37cd673-df2b-4869-8cf1-cb5fe7af9906"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:41:09.185745 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.185711 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c37cd673-df2b-4869-8cf1-cb5fe7af9906-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "c37cd673-df2b-4869-8cf1-cb5fe7af9906" (UID: "c37cd673-df2b-4869-8cf1-cb5fe7af9906"). 
InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:41:09.187334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.187307 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kube-api-access-wh97t" (OuterVolumeSpecName: "kube-api-access-wh97t") pod "c37cd673-df2b-4869-8cf1-cb5fe7af9906" (UID: "c37cd673-df2b-4869-8cf1-cb5fe7af9906"). InnerVolumeSpecName "kube-api-access-wh97t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:41:09.187334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.187326 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c37cd673-df2b-4869-8cf1-cb5fe7af9906" (UID: "c37cd673-df2b-4869-8cf1-cb5fe7af9906"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:41:09.286115 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.286072 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:41:09.286115 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.286105 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wh97t\" (UniqueName: \"kubernetes.io/projected/c37cd673-df2b-4869-8cf1-cb5fe7af9906-kube-api-access-wh97t\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:41:09.286115 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.286120 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c37cd673-df2b-4869-8cf1-cb5fe7af9906-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:41:09.286357 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.286134 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c37cd673-df2b-4869-8cf1-cb5fe7af9906-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:41:09.994394 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.994358 2568 generic.go:358] "Generic (PLEG): container finished" podID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerID="796fb49549e347e46d732c5f435d3e513c2892358e105c8c09b78a8c1dce9e38" exitCode=0 Apr 23 18:41:09.994823 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.994432 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerDied","Data":"796fb49549e347e46d732c5f435d3e513c2892358e105c8c09b78a8c1dce9e38"} Apr 23 18:41:09.994823 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:09.994526 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf" Apr 23 18:41:10.054097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:10.054065 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"] Apr 23 18:41:10.060204 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:10.060175 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d99c9448b-n68gf"] Apr 23 18:41:10.600420 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:10.600375 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" path="/var/lib/kubelet/pods/c37cd673-df2b-4869-8cf1-cb5fe7af9906/volumes" Apr 23 18:41:14.008957 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:14.008844 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerStarted","Data":"10ec4d6cdb0a62286dc946237d010a7eed6288a64bcb404468ef7a4e5fe16230"} Apr 23 18:41:14.008957 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:14.008892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerStarted","Data":"64bc970902eb0d4f3401f8f8e6d07e6b700cb6709344dc32f4659e31ad245486"} Apr 23 18:41:14.009485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:14.009161 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:14.027644 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:14.027580 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podStartSLOduration=6.36050607 podStartE2EDuration="10.027563466s" podCreationTimestamp="2026-04-23 18:41:04 +0000 UTC" firstStartedPulling="2026-04-23 18:41:09.996144081 +0000 UTC m=+2927.916236533" lastFinishedPulling="2026-04-23 18:41:13.663201472 +0000 UTC m=+2931.583293929" observedRunningTime="2026-04-23 18:41:14.026823664 +0000 UTC m=+2931.946916136" watchObservedRunningTime="2026-04-23 18:41:14.027563466 +0000 UTC m=+2931.947655942" Apr 23 18:41:15.012966 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:15.012916 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:15.014190 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:15.014163 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:41:16.016595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:16.016549 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:41:21.021268 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:21.021239 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:21.021908 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:21.021880 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:41:31.022884 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:31.022854 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:45.923988 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:45.923880 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r"] Apr 23 18:41:45.924387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:45.924246 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kserve-container" containerID="cri-o://64bc970902eb0d4f3401f8f8e6d07e6b700cb6709344dc32f4659e31ad245486" gracePeriod=30 Apr 23 18:41:45.924387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:45.924309 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" containerID="cri-o://10ec4d6cdb0a62286dc946237d010a7eed6288a64bcb404468ef7a4e5fe16230" gracePeriod=30 Apr 23 18:41:46.016843 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.016800 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"] Apr 23 18:41:46.017205 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017178 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:41:46.017321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017187 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kube-rbac-proxy" Apr 23 18:41:46.017321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017299 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kube-rbac-proxy" Apr 23 18:41:46.017321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017313 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="storage-initializer" Apr 23 18:41:46.017321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017320 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="storage-initializer" Apr 23 18:41:46.017530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017326 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" Apr 23 18:41:46.017530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017331 2568 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" Apr 23 18:41:46.017530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017390 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kserve-container" Apr 23 18:41:46.017530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.017401 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c37cd673-df2b-4869-8cf1-cb5fe7af9906" containerName="kube-rbac-proxy" Apr 23 18:41:46.020675 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.020654 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.022426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.022405 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 23 18:41:46.022426 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.022413 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:41:46.028537 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.028514 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"] Apr 23 18:41:46.043673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.043646 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3abe383a-f080-46ad-8500-4e27901fe96c-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.043809 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.043697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.043809 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.043754 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d79f5\" (UniqueName: \"kubernetes.io/projected/3abe383a-f080-46ad-8500-4e27901fe96c-kube-api-access-d79f5\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.043900 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.043812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe383a-f080-46ad-8500-4e27901fe96c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.102404 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:41:46.102371 2568 generic.go:358] "Generic (PLEG): container finished" podID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerID="10ec4d6cdb0a62286dc946237d010a7eed6288a64bcb404468ef7a4e5fe16230" exitCode=2 Apr 23 18:41:46.102583 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.102444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerDied","Data":"10ec4d6cdb0a62286dc946237d010a7eed6288a64bcb404468ef7a4e5fe16230"} Apr 23 18:41:46.144405 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.144367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3abe383a-f080-46ad-8500-4e27901fe96c-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.144527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.144433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.144527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.144475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d79f5\" (UniqueName: \"kubernetes.io/projected/3abe383a-f080-46ad-8500-4e27901fe96c-kube-api-access-d79f5\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.144527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.144505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe383a-f080-46ad-8500-4e27901fe96c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.144646 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:41:46.144597 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 23 18:41:46.144685 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:41:46.144672 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls podName:3abe383a-f080-46ad-8500-4e27901fe96c nodeName:}" failed. No retries permitted until 2026-04-23 18:41:46.644651876 +0000 UTC m=+2964.564744343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" (UID: "3abe383a-f080-46ad-8500-4e27901fe96c") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 23 18:41:46.144862 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.144838 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3abe383a-f080-46ad-8500-4e27901fe96c-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.145152 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.145135 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe383a-f080-46ad-8500-4e27901fe96c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.157058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.157026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79f5\" (UniqueName: \"kubernetes.io/projected/3abe383a-f080-46ad-8500-4e27901fe96c-kube-api-access-d79f5\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.647092 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.647046 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.649464 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.649440 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:46.932007 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:46.931911 2568 util.go:30] "No sandbox for pod can be found. 
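The two E-level records above are the interesting ones: MountVolume.SetUp for "proxy-tls" fails because the serving-cert secret has not been created yet, the operation is parked with durationBeforeRetry 500ms, and the retry at 18:41:46.647 succeeds once the secret exists. A minimal Go sketch of that doubling-delay schedule follows; the 500ms start is the only constant actually printed in this log, while the 2x growth factor and the roughly 2m2s cap are assumptions about kubelet's exponential backoff, not values taken from here:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // 500ms is the durationBeforeRetry printed by nestedpendingoperations above.
        delay := 500 * time.Millisecond
        // Assumed cap: kubelet stops growing the delay somewhere around two minutes.
        maxDelay := 2*time.Minute + 2*time.Second
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
            delay *= 2 // assumed exponential growth factor
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

In this capture the first retry already succeeds, so the schedule never grows past 500ms; the same fail-then-mount pattern repeats below for the triton and xgboost predictors.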
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:47.053956 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:47.053916 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"] Apr 23 18:41:47.056588 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:41:47.056548 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abe383a_f080_46ad_8500_4e27901fe96c.slice/crio-f2215a58d75ce35e51eb7df77e6bd1017e36a3c0262395938c1eaa010e0e476e WatchSource:0}: Error finding container f2215a58d75ce35e51eb7df77e6bd1017e36a3c0262395938c1eaa010e0e476e: Status 404 returned error can't find the container with id f2215a58d75ce35e51eb7df77e6bd1017e36a3c0262395938c1eaa010e0e476e Apr 23 18:41:47.106981 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:47.106901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerStarted","Data":"f2215a58d75ce35e51eb7df77e6bd1017e36a3c0262395938c1eaa010e0e476e"} Apr 23 18:41:48.110828 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:48.110793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerStarted","Data":"58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63"} Apr 23 18:41:51.017301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:51.017257 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:41:52.124194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:52.124157 2568 generic.go:358] "Generic (PLEG): container finished" podID="3abe383a-f080-46ad-8500-4e27901fe96c" containerID="58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63" exitCode=0 Apr 23 18:41:52.124680 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:52.124236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerDied","Data":"58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63"} Apr 23 18:41:53.128517 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:53.128483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerStarted","Data":"f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6"} Apr 23 18:41:53.128517 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:53.128520 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerStarted","Data":"70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311"} Apr 23 18:41:53.128968 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:53.128765 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:53.128968 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:53.128924 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:53.130187 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:53.130164 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 23 18:41:53.151317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:53.151277 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podStartSLOduration=8.151266778 podStartE2EDuration="8.151266778s" podCreationTimestamp="2026-04-23 18:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:41:53.149677491 +0000 UTC m=+2971.069769988" watchObservedRunningTime="2026-04-23 18:41:53.151266778 +0000 UTC m=+2971.071359243" Apr 23 18:41:54.131487 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:54.131450 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 23 18:41:56.017766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:56.017715 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:41:56.018190 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:56.017848 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:41:59.135737 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:59.135705 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:41:59.136324 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:41:59.136297 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 23 18:42:01.017798 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:01.017757 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:42:06.017660 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:06.017612 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:42:09.136920 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:09.136890 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" Apr 23 18:42:11.017078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:11.017037 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:42:16.017212 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.017170 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:42:16.195872 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.195786 2568 generic.go:358] "Generic (PLEG): container finished" podID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerID="64bc970902eb0d4f3401f8f8e6d07e6b700cb6709344dc32f4659e31ad245486" exitCode=137 Apr 23 18:42:16.195872 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.195849 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerDied","Data":"64bc970902eb0d4f3401f8f8e6d07e6b700cb6709344dc32f4659e31ad245486"} Apr 23 18:42:16.565306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.565279 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:42:16.650375 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.650337 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndpwt\" (UniqueName: \"kubernetes.io/projected/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kube-api-access-ndpwt\") pod \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " Apr 23 18:42:16.650569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.650386 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kserve-provision-location\") pod \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " Apr 23 18:42:16.650569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.650427 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " Apr 23 18:42:16.650569 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.650531 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-proxy-tls\") pod \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\" (UID: \"0a42b44c-1120-45fe-88d3-7a6edc4e52c4\") " Apr 23 18:42:16.650748 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.650707 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "0a42b44c-1120-45fe-88d3-7a6edc4e52c4" (UID: "0a42b44c-1120-45fe-88d3-7a6edc4e52c4"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:16.650818 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.650766 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:42:16.652492 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.652465 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kube-api-access-ndpwt" (OuterVolumeSpecName: "kube-api-access-ndpwt") pod "0a42b44c-1120-45fe-88d3-7a6edc4e52c4" (UID: "0a42b44c-1120-45fe-88d3-7a6edc4e52c4"). InnerVolumeSpecName "kube-api-access-ndpwt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:42:16.652602 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.652501 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0a42b44c-1120-45fe-88d3-7a6edc4e52c4" (UID: "0a42b44c-1120-45fe-88d3-7a6edc4e52c4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:42:16.666625 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.666586 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a42b44c-1120-45fe-88d3-7a6edc4e52c4" (UID: "0a42b44c-1120-45fe-88d3-7a6edc4e52c4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:42:16.751792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.751705 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:42:16.751792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.751735 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:42:16.751792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:16.751746 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndpwt\" (UniqueName: \"kubernetes.io/projected/0a42b44c-1120-45fe-88d3-7a6edc4e52c4-kube-api-access-ndpwt\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:42:17.199742 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.199709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" event={"ID":"0a42b44c-1120-45fe-88d3-7a6edc4e52c4","Type":"ContainerDied","Data":"292a0ce9f1259402dbe9e4b5ed863595c6e6c3dfc43da97fbd6f4a50dca63c1b"} Apr 23 18:42:17.200253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.199752 2568 scope.go:117] "RemoveContainer" containerID="10ec4d6cdb0a62286dc946237d010a7eed6288a64bcb404468ef7a4e5fe16230" Apr 23 18:42:17.200253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.199780 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r" Apr 23 18:42:17.210208 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.210187 2568 scope.go:117] "RemoveContainer" containerID="64bc970902eb0d4f3401f8f8e6d07e6b700cb6709344dc32f4659e31ad245486" Apr 23 18:42:17.217575 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.217555 2568 scope.go:117] "RemoveContainer" containerID="796fb49549e347e46d732c5f435d3e513c2892358e105c8c09b78a8c1dce9e38" Apr 23 18:42:17.225368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.225342 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r"] Apr 23 18:42:17.230983 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:17.230956 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-xzg8r"] Apr 23 18:42:18.598963 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:18.598910 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" path="/var/lib/kubelet/pods/0a42b44c-1120-45fe-88d3-7a6edc4e52c4/volumes" Apr 23 18:42:22.707756 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:22.707727 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:42:22.711143 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:22.711120 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:42:22.712314 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:22.712297 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:42:22.715555 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:22.715537 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log" Apr 23 18:42:27.436008 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.435962 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"] Apr 23 18:42:27.436408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.436383 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container" containerID="cri-o://70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311" gracePeriod=30 Apr 23 18:42:27.436475 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.436423 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" containerID="cri-o://f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6" gracePeriod=30 Apr 23 18:42:27.499363 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499333 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"] Apr 23 18:42:27.499612 ip-10-0-131-144 kubenswrapper[2568]: I0423 
Apr 23 18:42:27.499676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499614 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kserve-container"
Apr 23 18:42:27.499676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499623 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="storage-initializer"
Apr 23 18:42:27.499676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499628 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="storage-initializer"
Apr 23 18:42:27.499676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499636 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy"
Apr 23 18:42:27.499676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499642 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy"
Apr 23 18:42:27.499830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499687 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kube-rbac-proxy"
Apr 23 18:42:27.499830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.499698 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a42b44c-1120-45fe-88d3-7a6edc4e52c4" containerName="kserve-container"
Apr 23 18:42:27.503016 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.502996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.505128 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.505099 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\""
Apr 23 18:42:27.505249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.505107 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\""
Apr 23 18:42:27.512289 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.512267 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"]
Apr 23 18:42:27.537041 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.537011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce6223bd-de94-48b1-9d38-d12836c50320-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.537041 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.537044 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.537235 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.537063 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce6223bd-de94-48b1-9d38-d12836c50320-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.537235 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.537147 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfgb\" (UniqueName: \"kubernetes.io/projected/ce6223bd-de94-48b1-9d38-d12836c50320-kube-api-access-5kfgb\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.637645 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.637607 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfgb\" (UniqueName: \"kubernetes.io/projected/ce6223bd-de94-48b1-9d38-d12836c50320-kube-api-access-5kfgb\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.637874 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.637661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce6223bd-de94-48b1-9d38-d12836c50320-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.637874 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.637689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.637874 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.637727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce6223bd-de94-48b1-9d38-d12836c50320-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.637874 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:42:27.637824 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found
Apr 23 18:42:27.638153 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:42:27.637949 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls podName:ce6223bd-de94-48b1-9d38-d12836c50320 nodeName:}" failed. No retries permitted until 2026-04-23 18:42:28.137907835 +0000 UTC m=+3006.058000300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-5888t" (UID: "ce6223bd-de94-48b1-9d38-d12836c50320") : secret "isvc-triton-predictor-serving-cert" not found
Apr 23 18:42:27.638325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.638305 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce6223bd-de94-48b1-9d38-d12836c50320-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.638422 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.638403 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce6223bd-de94-48b1-9d38-d12836c50320-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:27.647350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:27.647327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfgb\" (UniqueName: \"kubernetes.io/projected/ce6223bd-de94-48b1-9d38-d12836c50320-kube-api-access-5kfgb\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:28.142175 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:28.142130 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:28.144560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:28.144534 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5888t\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:28.234020 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:28.233989 2568 generic.go:358] "Generic (PLEG): container finished" podID="3abe383a-f080-46ad-8500-4e27901fe96c" containerID="f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6" exitCode=2
Apr 23 18:42:28.234224 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:28.234073 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerDied","Data":"f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6"}
Apr 23 18:42:28.414151 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:28.414049 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:42:28.536029 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:28.535999 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"]
Apr 23 18:42:28.539094 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:42:28.539062 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce6223bd_de94_48b1_9d38_d12836c50320.slice/crio-97dd62a2b23d411b50787966136174fbd342cc2fc44af60822a4403edbe5c50b WatchSource:0}: Error finding container 97dd62a2b23d411b50787966136174fbd342cc2fc44af60822a4403edbe5c50b: Status 404 returned error can't find the container with id 97dd62a2b23d411b50787966136174fbd342cc2fc44af60822a4403edbe5c50b
Apr 23 18:42:29.132389 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:29.132348 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused"
Apr 23 18:42:29.238669 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:29.238629 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerStarted","Data":"d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2"}
Apr 23 18:42:29.238669 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:29.238664 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerStarted","Data":"97dd62a2b23d411b50787966136174fbd342cc2fc44af60822a4403edbe5c50b"}
Apr 23 18:42:32.249241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:32.249209 2568 generic.go:358] "Generic (PLEG): container finished" podID="ce6223bd-de94-48b1-9d38-d12836c50320" containerID="d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2" exitCode=0
Apr 23 18:42:32.249624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:32.249282 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerDied","Data":"d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2"}
Apr 23 18:42:34.132423 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:34.131991 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused"
Apr 23 18:42:39.132047 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:39.131852 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused"
Apr 23 18:42:39.132642 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:39.132060 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"
Apr 23 18:42:44.131717 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:44.131670 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused"
Apr 23 18:42:49.132274 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:49.131961 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused"
Apr 23 18:42:54.132660 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:54.132613 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused"
Apr 23 18:42:58.126097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.126020 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"
Apr 23 18:42:58.306560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.306464 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe383a-f080-46ad-8500-4e27901fe96c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"3abe383a-f080-46ad-8500-4e27901fe96c\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") "
Apr 23 18:42:58.306560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.306543 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3abe383a-f080-46ad-8500-4e27901fe96c-kserve-provision-location\") pod \"3abe383a-f080-46ad-8500-4e27901fe96c\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") "
Apr 23 18:42:58.306805 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.306566 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d79f5\" (UniqueName: \"kubernetes.io/projected/3abe383a-f080-46ad-8500-4e27901fe96c-kube-api-access-d79f5\") pod \"3abe383a-f080-46ad-8500-4e27901fe96c\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") "
Apr 23 18:42:58.306805 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.306618 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls\") pod \"3abe383a-f080-46ad-8500-4e27901fe96c\" (UID: \"3abe383a-f080-46ad-8500-4e27901fe96c\") "
Apr 23 18:42:58.307493 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.307455 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3abe383a-f080-46ad-8500-4e27901fe96c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "3abe383a-f080-46ad-8500-4e27901fe96c" (UID: "3abe383a-f080-46ad-8500-4e27901fe96c"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:42:58.309811 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.309748 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abe383a-f080-46ad-8500-4e27901fe96c-kube-api-access-d79f5" (OuterVolumeSpecName: "kube-api-access-d79f5") pod "3abe383a-f080-46ad-8500-4e27901fe96c" (UID: "3abe383a-f080-46ad-8500-4e27901fe96c"). InnerVolumeSpecName "kube-api-access-d79f5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:42:58.310060 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.310006 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3abe383a-f080-46ad-8500-4e27901fe96c" (UID: "3abe383a-f080-46ad-8500-4e27901fe96c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:42:58.311239 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.311212 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abe383a-f080-46ad-8500-4e27901fe96c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3abe383a-f080-46ad-8500-4e27901fe96c" (UID: "3abe383a-f080-46ad-8500-4e27901fe96c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:42:58.362380 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.362344 2568 generic.go:358] "Generic (PLEG): container finished" podID="3abe383a-f080-46ad-8500-4e27901fe96c" containerID="70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311" exitCode=137
Apr 23 18:42:58.362554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.362450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerDied","Data":"70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311"}
Apr 23 18:42:58.362554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.362473 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"
Apr 23 18:42:58.362554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.362496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw" event={"ID":"3abe383a-f080-46ad-8500-4e27901fe96c","Type":"ContainerDied","Data":"f2215a58d75ce35e51eb7df77e6bd1017e36a3c0262395938c1eaa010e0e476e"}
Apr 23 18:42:58.362554 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.362516 2568 scope.go:117] "RemoveContainer" containerID="f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6"
Apr 23 18:42:58.373974 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.373538 2568 scope.go:117] "RemoveContainer" containerID="70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311"
Apr 23 18:42:58.382858 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.382835 2568 scope.go:117] "RemoveContainer" containerID="58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63"
Apr 23 18:42:58.390421 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.390381 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"]
Apr 23 18:42:58.391852 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.391833 2568 scope.go:117] "RemoveContainer" containerID="f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6"
Apr 23 18:42:58.392559 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:42:58.392395 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6\": container with ID starting with f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6 not found: ID does not exist" containerID="f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6"
Apr 23 18:42:58.393302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.393267 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6"} err="failed to get container status \"f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6\": rpc error: code = NotFound desc = could not find container \"f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6\": container with ID starting with f5475b2b4e8f9542f4d2476e7fbc8a23419aecf9ba0ac4ce5c489cc8afd4e7e6 not found: ID does not exist"
Apr 23 18:42:58.393410 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.393303 2568 scope.go:117] "RemoveContainer" containerID="70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311"
Apr 23 18:42:58.393611 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:42:58.393586 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311\": container with ID starting with 70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311 not found: ID does not exist" containerID="70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311"
Apr 23 18:42:58.393674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.393619 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311"} err="failed to get container status \"70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311\": rpc error: code = NotFound desc = could not find container \"70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311\": container with ID starting with 70d7355de29168e961b6a0d1aecfe3e0d83356dd80e72d1a1e19d641fcf80311 not found: ID does not exist"
Apr 23 18:42:58.393674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.393657 2568 scope.go:117] "RemoveContainer" containerID="58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63"
Apr 23 18:42:58.394054 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:42:58.393979 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63\": container with ID starting with 58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63 not found: ID does not exist" containerID="58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63"
Apr 23 18:42:58.394054 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.394022 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63"} err="failed to get container status \"58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63\": rpc error: code = NotFound desc = could not find container \"58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63\": container with ID starting with 58b1c4bb786c070a7ec1e02670d31972a3ad39be6a4b103dc154beef4fd01d63 not found: ID does not exist"
Apr 23 18:42:58.394306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.394283 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-tkgpw"]
Apr 23 18:42:58.407451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.407428 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe383a-f080-46ad-8500-4e27901fe96c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:42:58.407559 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.407456 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3abe383a-f080-46ad-8500-4e27901fe96c-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:42:58.407559 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.407475 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d79f5\" (UniqueName: \"kubernetes.io/projected/3abe383a-f080-46ad-8500-4e27901fe96c-kube-api-access-d79f5\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:42:58.407559 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.407490 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe383a-f080-46ad-8500-4e27901fe96c-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:42:58.600723 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:42:58.600679 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" path="/var/lib/kubelet/pods/3abe383a-f080-46ad-8500-4e27901fe96c/volumes"
Apr 23 18:44:27.640374 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:27.640333 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerStarted","Data":"4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7"}
Apr 23 18:44:27.640374 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:27.640375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerStarted","Data":"2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f"}
Apr 23 18:44:27.640885 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:27.640515 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:44:27.640885 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:27.640651 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:44:27.641945 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:27.641902 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 23 18:44:27.664428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:27.664380 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" podStartSLOduration=6.112884696 podStartE2EDuration="2m0.664364585s" podCreationTimestamp="2026-04-23 18:42:27 +0000 UTC" firstStartedPulling="2026-04-23 18:42:32.250349373 +0000 UTC m=+3010.170441827" lastFinishedPulling="2026-04-23 18:44:26.801829249 +0000 UTC m=+3124.721921716" observedRunningTime="2026-04-23 18:44:27.658982959 +0000 UTC m=+3125.579075431" watchObservedRunningTime="2026-04-23 18:44:27.664364585 +0000 UTC m=+3125.584457059"
Apr 23 18:44:28.643659 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:28.643615 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 23 18:44:33.648109 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:33.648018 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:44:33.648885 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:33.648863 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"
Apr 23 18:44:39.346043 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.346012 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"]
Apr 23 18:44:39.346570 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.346327 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kserve-container" containerID="cri-o://2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f" gracePeriod=30
Apr 23 18:44:39.346570 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.346391 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kube-rbac-proxy" containerID="cri-o://4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7" gracePeriod=30
Apr 23 18:44:39.457657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457626 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"]
Apr 23 18:44:39.457953 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457915 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container"
Apr 23 18:44:39.457953 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457947 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container"
Apr 23 18:44:39.458116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457961 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="storage-initializer"
Apr 23 18:44:39.458116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457968 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="storage-initializer"
Apr 23 18:44:39.458116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457975 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy"
Apr 23 18:44:39.458116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.457980 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy"
Apr 23 18:44:39.458116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.458042 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kserve-container"
Apr 23 18:44:39.458116 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.458053 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3abe383a-f080-46ad-8500-4e27901fe96c" containerName="kube-rbac-proxy"
Apr 23 18:44:39.476435 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.476394 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"]
Apr 23 18:44:39.476626 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.476566 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.478446 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.478425 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\""
Apr 23 18:44:39.478568 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.478425 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\""
Apr 23 18:44:39.533804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.533765 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980e2734-f854-455a-aed9-93aa43022fa9-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.533997 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.533824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/980e2734-f854-455a-aed9-93aa43022fa9-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.533997 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.533949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.533997 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.533980 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxf4\" (UniqueName: \"kubernetes.io/projected/980e2734-f854-455a-aed9-93aa43022fa9-kube-api-access-nrxf4\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.634910 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.634822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.634910 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.634859 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxf4\" (UniqueName: \"kubernetes.io/projected/980e2734-f854-455a-aed9-93aa43022fa9-kube-api-access-nrxf4\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.635144 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:44:39.634984 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found
Apr 23 18:44:39.635144 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:44:39.635046 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls podName:980e2734-f854-455a-aed9-93aa43022fa9 nodeName:}" failed. No retries permitted until 2026-04-23 18:44:40.13503054 +0000 UTC m=+3138.055122994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-fhdjq" (UID: "980e2734-f854-455a-aed9-93aa43022fa9") : secret "isvc-xgboost-predictor-serving-cert" not found
Apr 23 18:44:39.635144 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.634985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980e2734-f854-455a-aed9-93aa43022fa9-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.635144 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.635099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/980e2734-f854-455a-aed9-93aa43022fa9-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.635320 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.635304 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980e2734-f854-455a-aed9-93aa43022fa9-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.635644 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.635628 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/980e2734-f854-455a-aed9-93aa43022fa9-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.643661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.643637 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxf4\" (UniqueName: \"kubernetes.io/projected/980e2734-f854-455a-aed9-93aa43022fa9-kube-api-access-nrxf4\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:39.679514 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.679477 2568 generic.go:358] "Generic (PLEG): container finished" podID="ce6223bd-de94-48b1-9d38-d12836c50320" containerID="4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7" exitCode=2
Apr 23 18:44:39.679667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:39.679547 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerDied","Data":"4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7"}
Apr 23 18:44:40.137878 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:40.137840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:40.140250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:40.140224 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fhdjq\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:40.386831 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:40.386791 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"
Apr 23 18:44:40.547815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:40.547770 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"]
Apr 23 18:44:40.550683 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:44:40.550655 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980e2734_f854_455a_aed9_93aa43022fa9.slice/crio-00cf4011103e7fe76cc4bf030175bbf45d080d8446932047debce835f8eeec3a WatchSource:0}: Error finding container 00cf4011103e7fe76cc4bf030175bbf45d080d8446932047debce835f8eeec3a: Status 404 returned error can't find the container with id 00cf4011103e7fe76cc4bf030175bbf45d080d8446932047debce835f8eeec3a
Apr 23 18:44:40.683702 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:40.683624 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerStarted","Data":"99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485"}
Apr 23 18:44:40.683702 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:40.683664 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerStarted","Data":"00cf4011103e7fe76cc4bf030175bbf45d080d8446932047debce835f8eeec3a"}
Apr 23 18:44:41.591437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.591414 2568 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" Apr 23 18:44:41.650622 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.650563 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls\") pod \"ce6223bd-de94-48b1-9d38-d12836c50320\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " Apr 23 18:44:41.650622 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.650598 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kfgb\" (UniqueName: \"kubernetes.io/projected/ce6223bd-de94-48b1-9d38-d12836c50320-kube-api-access-5kfgb\") pod \"ce6223bd-de94-48b1-9d38-d12836c50320\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " Apr 23 18:44:41.650798 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.650627 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce6223bd-de94-48b1-9d38-d12836c50320-kserve-provision-location\") pod \"ce6223bd-de94-48b1-9d38-d12836c50320\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " Apr 23 18:44:41.650798 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.650654 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce6223bd-de94-48b1-9d38-d12836c50320-isvc-triton-kube-rbac-proxy-sar-config\") pod \"ce6223bd-de94-48b1-9d38-d12836c50320\" (UID: \"ce6223bd-de94-48b1-9d38-d12836c50320\") " Apr 23 18:44:41.651084 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.651056 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce6223bd-de94-48b1-9d38-d12836c50320-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce6223bd-de94-48b1-9d38-d12836c50320" (UID: "ce6223bd-de94-48b1-9d38-d12836c50320"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:44:41.651199 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.651059 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce6223bd-de94-48b1-9d38-d12836c50320-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "ce6223bd-de94-48b1-9d38-d12836c50320" (UID: "ce6223bd-de94-48b1-9d38-d12836c50320"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:44:41.653255 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.653234 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6223bd-de94-48b1-9d38-d12836c50320-kube-api-access-5kfgb" (OuterVolumeSpecName: "kube-api-access-5kfgb") pod "ce6223bd-de94-48b1-9d38-d12836c50320" (UID: "ce6223bd-de94-48b1-9d38-d12836c50320"). InnerVolumeSpecName "kube-api-access-5kfgb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:44:41.653255 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.653243 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ce6223bd-de94-48b1-9d38-d12836c50320" (UID: "ce6223bd-de94-48b1-9d38-d12836c50320"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:44:41.688007 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.687980 2568 generic.go:358] "Generic (PLEG): container finished" podID="ce6223bd-de94-48b1-9d38-d12836c50320" containerID="2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f" exitCode=0 Apr 23 18:44:41.688114 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.688072 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" Apr 23 18:44:41.688114 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.688070 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerDied","Data":"2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f"} Apr 23 18:44:41.688205 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.688113 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t" event={"ID":"ce6223bd-de94-48b1-9d38-d12836c50320","Type":"ContainerDied","Data":"97dd62a2b23d411b50787966136174fbd342cc2fc44af60822a4403edbe5c50b"} Apr 23 18:44:41.688205 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.688133 2568 scope.go:117] "RemoveContainer" containerID="4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7" Apr 23 18:44:41.695599 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.695435 2568 scope.go:117] "RemoveContainer" containerID="2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f" Apr 23 18:44:41.702309 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.702294 2568 scope.go:117] "RemoveContainer" containerID="d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2" Apr 23 18:44:41.708733 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.708719 2568 scope.go:117] "RemoveContainer" containerID="4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7" Apr 23 18:44:41.708977 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:44:41.708958 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7\": container with ID starting with 4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7 not found: ID does not exist" containerID="4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7" Apr 23 18:44:41.709030 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.708986 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7"} err="failed to get container status \"4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7\": rpc error: code = NotFound desc = could not find container \"4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7\": container with ID starting with 4ecad39aa1bbda178fa8525befed8839666dc703143750b2880dde7a4d6ee9c7 not found: ID does not exist" Apr 23 18:44:41.709030 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.709003 2568 scope.go:117] "RemoveContainer" containerID="2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f" Apr 23 18:44:41.709222 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:44:41.709203 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f\": container with ID starting with 2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f not found: ID does not exist" containerID="2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f" Apr 23 18:44:41.709263 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.709228 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f"} err="failed to get container status \"2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f\": rpc error: code = NotFound desc = could not find container \"2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f\": container with ID starting with 2d1bf00d648c4130823649ff34c00533906ecb764265b15ad0da8a2cd969df2f not found: ID does not exist" Apr 23 18:44:41.709263 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.709242 2568 scope.go:117] "RemoveContainer" containerID="d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2" Apr 23 18:44:41.709465 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:44:41.709448 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2\": container with ID starting with d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2 not found: ID does not exist" containerID="d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2" Apr 23 18:44:41.709521 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.709468 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2"} err="failed to get container status \"d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2\": rpc error: code = NotFound desc = could not find container \"d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2\": container with ID starting with d804933a5ee97aaa2540d411f5e01c4b2f7335cb6082db016ead3b476053d5f2 not found: ID does not exist" Apr 23 18:44:41.723773 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.723753 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"] Apr 23 18:44:41.736915 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.736894 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5888t"] Apr 23 18:44:41.751782 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.751759 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce6223bd-de94-48b1-9d38-d12836c50320-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:44:41.751860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.751786 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5kfgb\" (UniqueName: \"kubernetes.io/projected/ce6223bd-de94-48b1-9d38-d12836c50320-kube-api-access-5kfgb\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:44:41.751860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.751801 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce6223bd-de94-48b1-9d38-d12836c50320-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 
18:44:41.751860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:41.751816 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce6223bd-de94-48b1-9d38-d12836c50320-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:44:42.599952 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:42.599903 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" path="/var/lib/kubelet/pods/ce6223bd-de94-48b1-9d38-d12836c50320/volumes" Apr 23 18:44:45.700597 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:45.700557 2568 generic.go:358] "Generic (PLEG): container finished" podID="980e2734-f854-455a-aed9-93aa43022fa9" containerID="99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485" exitCode=0 Apr 23 18:44:45.701094 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:45.700630 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerDied","Data":"99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485"} Apr 23 18:44:45.701842 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:44:45.701826 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:45:03.754241 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:03.754211 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerStarted","Data":"4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681"} Apr 23 18:45:03.754573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:03.754248 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerStarted","Data":"88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe"} Apr 23 18:45:04.757219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:04.757179 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" Apr 23 18:45:04.757219 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:04.757227 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" Apr 23 18:45:04.758533 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:04.758503 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:45:04.776553 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:04.776507 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podStartSLOduration=7.943919438 podStartE2EDuration="25.77649563s" podCreationTimestamp="2026-04-23 18:44:39 +0000 UTC" firstStartedPulling="2026-04-23 18:44:45.701965398 +0000 UTC m=+3143.622057851" lastFinishedPulling="2026-04-23 18:45:03.53454159 +0000 UTC m=+3161.454634043" observedRunningTime="2026-04-23 18:45:04.775769854 +0000 UTC m=+3162.695862343" 
watchObservedRunningTime="2026-04-23 18:45:04.77649563 +0000 UTC m=+3162.696588105" Apr 23 18:45:05.760035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:05.759997 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:45:10.764152 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:10.764118 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" Apr 23 18:45:10.764676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:10.764648 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:45:20.765443 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:20.765404 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:45:30.764629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:30.764590 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:45:40.764620 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:40.764579 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:45:50.765117 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:45:50.765077 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:46:00.765249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:00.765207 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:46:10.766056 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:10.765979 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" Apr 23 18:46:19.567191 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.567152 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"] Apr 23 18:46:19.567711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.567502 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" 
podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" containerID="cri-o://88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe" gracePeriod=30 Apr 23 18:46:19.567711 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.567559 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kube-rbac-proxy" containerID="cri-o://4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681" gracePeriod=30 Apr 23 18:46:19.679259 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679215 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"] Apr 23 18:46:19.679544 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679526 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="storage-initializer" Apr 23 18:46:19.679544 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679542 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="storage-initializer" Apr 23 18:46:19.679656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679552 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kube-rbac-proxy" Apr 23 18:46:19.679656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679558 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kube-rbac-proxy" Apr 23 18:46:19.679656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679568 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kserve-container" Apr 23 18:46:19.679656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679575 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kserve-container" Apr 23 18:46:19.679656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679616 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kube-rbac-proxy" Apr 23 18:46:19.679656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.679628 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce6223bd-de94-48b1-9d38-d12836c50320" containerName="kserve-container" Apr 23 18:46:19.682792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.682772 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.684479 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.684458 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 23 18:46:19.684558 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.684468 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 23 18:46:19.692526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.692506 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"] Apr 23 18:46:19.803381 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.803350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3be1e59-5a16-460e-8089-d7b479f49f36-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.803595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.803389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3be1e59-5a16-460e-8089-d7b479f49f36-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.803595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.803427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfmlh\" (UniqueName: \"kubernetes.io/projected/d3be1e59-5a16-460e-8089-d7b479f49f36-kube-api-access-cfmlh\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.803595 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.803527 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3be1e59-5a16-460e-8089-d7b479f49f36-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.904700 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.904663 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3be1e59-5a16-460e-8089-d7b479f49f36-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.904918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.904709 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d3be1e59-5a16-460e-8089-d7b479f49f36-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.904918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.904755 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfmlh\" (UniqueName: \"kubernetes.io/projected/d3be1e59-5a16-460e-8089-d7b479f49f36-kube-api-access-cfmlh\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.904918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.904779 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3be1e59-5a16-460e-8089-d7b479f49f36-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.905197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.905176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3be1e59-5a16-460e-8089-d7b479f49f36-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.905532 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.905512 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3be1e59-5a16-460e-8089-d7b479f49f36-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.907283 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.907257 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3be1e59-5a16-460e-8089-d7b479f49f36-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.914811 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.914790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfmlh\" (UniqueName: \"kubernetes.io/projected/d3be1e59-5a16-460e-8089-d7b479f49f36-kube-api-access-cfmlh\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:19.970473 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.970435 2568 generic.go:358] "Generic (PLEG): container finished" podID="980e2734-f854-455a-aed9-93aa43022fa9" containerID="4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681" exitCode=2 Apr 23 18:46:19.970473 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:46:19.970487 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerDied","Data":"4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681"} Apr 23 18:46:19.993352 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:19.993319 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:20.117487 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:20.117385 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"] Apr 23 18:46:20.120116 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:46:20.120086 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3be1e59_5a16_460e_8089_d7b479f49f36.slice/crio-e83db869ce687ecdda479ca8e4e315940f62ede74101dcf6d4fe24de71a4da1e WatchSource:0}: Error finding container e83db869ce687ecdda479ca8e4e315940f62ede74101dcf6d4fe24de71a4da1e: Status 404 returned error can't find the container with id e83db869ce687ecdda479ca8e4e315940f62ede74101dcf6d4fe24de71a4da1e Apr 23 18:46:20.760728 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:20.760691 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 23 18:46:20.764598 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:20.764566 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 18:46:20.974804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:20.974765 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerStarted","Data":"8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb"} Apr 23 18:46:20.974804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:20.974806 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerStarted","Data":"e83db869ce687ecdda479ca8e4e315940f62ede74101dcf6d4fe24de71a4da1e"} Apr 23 18:46:23.231321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.231297 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" Apr 23 18:46:23.329019 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.328992 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrxf4\" (UniqueName: \"kubernetes.io/projected/980e2734-f854-455a-aed9-93aa43022fa9-kube-api-access-nrxf4\") pod \"980e2734-f854-455a-aed9-93aa43022fa9\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " Apr 23 18:46:23.329194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.329067 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980e2734-f854-455a-aed9-93aa43022fa9-kserve-provision-location\") pod \"980e2734-f854-455a-aed9-93aa43022fa9\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " Apr 23 18:46:23.329194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.329126 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/980e2734-f854-455a-aed9-93aa43022fa9-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"980e2734-f854-455a-aed9-93aa43022fa9\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " Apr 23 18:46:23.329194 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.329155 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls\") pod \"980e2734-f854-455a-aed9-93aa43022fa9\" (UID: \"980e2734-f854-455a-aed9-93aa43022fa9\") " Apr 23 18:46:23.329447 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.329418 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/980e2734-f854-455a-aed9-93aa43022fa9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "980e2734-f854-455a-aed9-93aa43022fa9" (UID: "980e2734-f854-455a-aed9-93aa43022fa9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:46:23.329577 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.329499 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/980e2734-f854-455a-aed9-93aa43022fa9-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "980e2734-f854-455a-aed9-93aa43022fa9" (UID: "980e2734-f854-455a-aed9-93aa43022fa9"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:46:23.331101 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.331075 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980e2734-f854-455a-aed9-93aa43022fa9-kube-api-access-nrxf4" (OuterVolumeSpecName: "kube-api-access-nrxf4") pod "980e2734-f854-455a-aed9-93aa43022fa9" (UID: "980e2734-f854-455a-aed9-93aa43022fa9"). InnerVolumeSpecName "kube-api-access-nrxf4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:46:23.331214 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.331190 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "980e2734-f854-455a-aed9-93aa43022fa9" (UID: "980e2734-f854-455a-aed9-93aa43022fa9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:46:23.429786 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.429744 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980e2734-f854-455a-aed9-93aa43022fa9-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:46:23.429786 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.429785 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/980e2734-f854-455a-aed9-93aa43022fa9-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:46:23.430021 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.429805 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/980e2734-f854-455a-aed9-93aa43022fa9-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:46:23.430021 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.429822 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrxf4\" (UniqueName: \"kubernetes.io/projected/980e2734-f854-455a-aed9-93aa43022fa9-kube-api-access-nrxf4\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:46:23.989137 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.989102 2568 generic.go:358] "Generic (PLEG): container finished" podID="980e2734-f854-455a-aed9-93aa43022fa9" containerID="88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe" exitCode=0 Apr 23 18:46:23.989332 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.989205 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" Apr 23 18:46:23.989332 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.989194 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerDied","Data":"88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe"} Apr 23 18:46:23.989332 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.989330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq" event={"ID":"980e2734-f854-455a-aed9-93aa43022fa9","Type":"ContainerDied","Data":"00cf4011103e7fe76cc4bf030175bbf45d080d8446932047debce835f8eeec3a"} Apr 23 18:46:23.989477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.989350 2568 scope.go:117] "RemoveContainer" containerID="4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681" Apr 23 18:46:23.997865 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:23.997844 2568 scope.go:117] "RemoveContainer" containerID="88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe" Apr 23 18:46:24.005491 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.005466 2568 scope.go:117] "RemoveContainer" containerID="99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485" Apr 23 18:46:24.012714 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.012690 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"] Apr 23 18:46:24.018253 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.018229 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fhdjq"] Apr 23 18:46:24.056508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.056486 2568 scope.go:117] "RemoveContainer" containerID="4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681" Apr 23 18:46:24.056862 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:46:24.056842 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681\": container with ID starting with 4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681 not found: ID does not exist" containerID="4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681" Apr 23 18:46:24.056907 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.056872 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681"} err="failed to get container status \"4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681\": rpc error: code = NotFound desc = could not find container \"4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681\": container with ID starting with 4107f3941e278f5fbbafdec6884e82cbb9f7e790aeac0921c1d70462cbd88681 not found: ID does not exist" Apr 23 18:46:24.056907 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.056893 2568 scope.go:117] "RemoveContainer" containerID="88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe" Apr 23 18:46:24.057194 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:46:24.057175 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe\": container with ID starting with 88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe not found: ID does not exist" containerID="88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe" Apr 23 18:46:24.057257 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.057200 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe"} err="failed to get container status \"88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe\": rpc error: code = NotFound desc = could not find container \"88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe\": container with ID starting with 88431299ba535d4047c5c7496235d6719449b97a1fe90f8d53b4ea540ac131fe not found: ID does not exist" Apr 23 18:46:24.057257 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.057216 2568 scope.go:117] "RemoveContainer" containerID="99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485" Apr 23 18:46:24.057456 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:46:24.057442 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485\": container with ID starting with 99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485 not found: ID does not exist" containerID="99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485" Apr 23 18:46:24.057495 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.057459 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485"} err="failed to get container status \"99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485\": rpc error: code = NotFound desc = could not find container \"99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485\": container with ID starting with 99ccc46400db33031a43d1884473bf775e3b2a8784449c6745fdf31606dec485 not found: ID does not exist" Apr 23 18:46:24.598682 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.598649 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980e2734-f854-455a-aed9-93aa43022fa9" path="/var/lib/kubelet/pods/980e2734-f854-455a-aed9-93aa43022fa9/volumes" Apr 23 18:46:24.993833 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.993744 2568 generic.go:358] "Generic (PLEG): container finished" podID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerID="8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb" exitCode=0 Apr 23 18:46:24.993833 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:24.993788 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerDied","Data":"8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb"} Apr 23 18:46:25.997816 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:25.997784 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerStarted","Data":"7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc"} Apr 23 18:46:25.998250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:25.997824 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerStarted","Data":"b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5"} Apr 23 18:46:25.998250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:25.998070 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:25.998250 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:25.998097 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:26.018558 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:26.018507 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" podStartSLOduration=7.018488714 podStartE2EDuration="7.018488714s" podCreationTimestamp="2026-04-23 18:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:46:26.017772 +0000 UTC m=+3243.937864474" watchObservedRunningTime="2026-04-23 18:46:26.018488714 +0000 UTC m=+3243.938581185" Apr 23 18:46:32.008154 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:32.008122 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:46:44.414619 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:44.414580 2568 scope.go:117] "RemoveContainer" containerID="0148638c9529385806c40560bbc7e1c275f5c12977b202d9a2b50d9e5f84756e" Apr 23 18:46:44.422040 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:44.422018 2568 scope.go:117] "RemoveContainer" containerID="5cf4ad4052272d5eb80a43747aabfec8396fc6f8411f50d5fcccc747b2c46624" Apr 23 18:46:44.428662 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:46:44.428642 2568 scope.go:117] "RemoveContainer" containerID="22ce176907e6d1910eb69018e30ae8f2b732042880c1468846500a3d7bd4d1d9" Apr 23 18:47:02.069284 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:02.069248 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" Apr 23 18:47:09.737561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.737521 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"] Apr 23 18:47:09.738071 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.737864 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kserve-container" containerID="cri-o://b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5" gracePeriod=30 Apr 23 18:47:09.738071 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.737924 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kube-rbac-proxy" containerID="cri-o://7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc" gracePeriod=30 Apr 23 18:47:09.844600 ip-10-0-131-144 kubenswrapper[2568]: I0423 
18:47:09.844556 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"]
Apr 23 18:47:09.844876 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844860 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container"
Apr 23 18:47:09.844972 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844879 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container"
Apr 23 18:47:09.844972 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844896 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="storage-initializer"
Apr 23 18:47:09.844972 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844901 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="storage-initializer"
Apr 23 18:47:09.844972 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844907 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kube-rbac-proxy"
Apr 23 18:47:09.844972 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844913 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kube-rbac-proxy"
Apr 23 18:47:09.845167 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844976 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kube-rbac-proxy"
Apr 23 18:47:09.845167 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.844987 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="980e2734-f854-455a-aed9-93aa43022fa9" containerName="kserve-container"
Apr 23 18:47:09.848316 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.848292 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.850529 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.850503 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 23 18:47:09.850726 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.850709 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\""
Apr 23 18:47:09.859667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.859639 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"]
Apr 23 18:47:09.866293 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.866265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be601995-977f-4504-94c2-9e6c12819b52-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.866453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.866307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be601995-977f-4504-94c2-9e6c12819b52-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.866453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.866339 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrvx\" (UniqueName: \"kubernetes.io/projected/be601995-977f-4504-94c2-9e6c12819b52-kube-api-access-xzrvx\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.866453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.866375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be601995-977f-4504-94c2-9e6c12819b52-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.967238 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.967200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrvx\" (UniqueName: \"kubernetes.io/projected/be601995-977f-4504-94c2-9e6c12819b52-kube-api-access-xzrvx\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.967455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.967255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be601995-977f-4504-94c2-9e6c12819b52-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.967455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.967331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be601995-977f-4504-94c2-9e6c12819b52-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.967455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.967370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be601995-977f-4504-94c2-9e6c12819b52-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.967764 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.967743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be601995-977f-4504-94c2-9e6c12819b52-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.967978 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.967960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be601995-977f-4504-94c2-9e6c12819b52-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.969807 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.969775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be601995-977f-4504-94c2-9e6c12819b52-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:09.978382 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:09.978356 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrvx\" (UniqueName: \"kubernetes.io/projected/be601995-977f-4504-94c2-9e6c12819b52-kube-api-access-xzrvx\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-6ccvs\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:10.122812 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:10.122781 2568 generic.go:358] "Generic (PLEG): container finished" podID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerID="7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc" exitCode=2
Apr 23 18:47:10.123023 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:10.122844 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerDied","Data":"7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc"}
Apr 23 18:47:10.160135 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:10.160106 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:10.295001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:10.294965 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"]
Apr 23 18:47:10.298320 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:47:10.298292 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe601995_977f_4504_94c2_9e6c12819b52.slice/crio-70d96f6d9e657e1c45f12e54f0b897eabd56d3d1fffb11f5131c0b27cce0a88a WatchSource:0}: Error finding container 70d96f6d9e657e1c45f12e54f0b897eabd56d3d1fffb11f5131c0b27cce0a88a: Status 404 returned error can't find the container with id 70d96f6d9e657e1c45f12e54f0b897eabd56d3d1fffb11f5131c0b27cce0a88a
Apr 23 18:47:11.126849 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:11.126811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerStarted","Data":"c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10"}
Apr 23 18:47:11.126849 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:11.126851 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerStarted","Data":"70d96f6d9e657e1c45f12e54f0b897eabd56d3d1fffb11f5131c0b27cce0a88a"}
Apr 23 18:47:12.002155 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:12.002109 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.51:8643/healthz\": dial tcp 10.133.0.51:8643: connect: connection refused"
Apr 23 18:47:14.137035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:14.136997 2568 generic.go:358] "Generic (PLEG): container finished" podID="be601995-977f-4504-94c2-9e6c12819b52" containerID="c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10" exitCode=0
Apr 23 18:47:14.137437 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:14.137071 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerDied","Data":"c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10"}
Apr 23 18:47:15.142427 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:15.142394 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerStarted","Data":"ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92"}
Apr 23 18:47:15.142427 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:15.142432 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerStarted","Data":"789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475"}
Apr 23 18:47:15.143031 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:15.142663 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:15.163605 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:15.163560 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" podStartSLOduration=6.163543203 podStartE2EDuration="6.163543203s" podCreationTimestamp="2026-04-23 18:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:47:15.162410868 +0000 UTC m=+3293.082503355" watchObservedRunningTime="2026-04-23 18:47:15.163543203 +0000 UTC m=+3293.083635678"
Apr 23 18:47:16.145567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.145534 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:16.575768 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.575742 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"
Apr 23 18:47:16.619299 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.619272 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3be1e59-5a16-460e-8089-d7b479f49f36-proxy-tls\") pod \"d3be1e59-5a16-460e-8089-d7b479f49f36\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") "
Apr 23 18:47:16.619491 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.619311 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfmlh\" (UniqueName: \"kubernetes.io/projected/d3be1e59-5a16-460e-8089-d7b479f49f36-kube-api-access-cfmlh\") pod \"d3be1e59-5a16-460e-8089-d7b479f49f36\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") "
Apr 23 18:47:16.619491 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.619340 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3be1e59-5a16-460e-8089-d7b479f49f36-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"d3be1e59-5a16-460e-8089-d7b479f49f36\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") "
Apr 23 18:47:16.619491 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.619368 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3be1e59-5a16-460e-8089-d7b479f49f36-kserve-provision-location\") pod \"d3be1e59-5a16-460e-8089-d7b479f49f36\" (UID: \"d3be1e59-5a16-460e-8089-d7b479f49f36\") "
Apr 23 18:47:16.619722 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.619696 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3be1e59-5a16-460e-8089-d7b479f49f36-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d3be1e59-5a16-460e-8089-d7b479f49f36" (UID: "d3be1e59-5a16-460e-8089-d7b479f49f36"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:47:16.619722 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.619707 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3be1e59-5a16-460e-8089-d7b479f49f36-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "d3be1e59-5a16-460e-8089-d7b479f49f36" (UID: "d3be1e59-5a16-460e-8089-d7b479f49f36"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:47:16.621427 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.621402 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3be1e59-5a16-460e-8089-d7b479f49f36-kube-api-access-cfmlh" (OuterVolumeSpecName: "kube-api-access-cfmlh") pod "d3be1e59-5a16-460e-8089-d7b479f49f36" (UID: "d3be1e59-5a16-460e-8089-d7b479f49f36"). InnerVolumeSpecName "kube-api-access-cfmlh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:47:16.621536 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.621433 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3be1e59-5a16-460e-8089-d7b479f49f36-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d3be1e59-5a16-460e-8089-d7b479f49f36" (UID: "d3be1e59-5a16-460e-8089-d7b479f49f36"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:47:16.720284 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.720202 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3be1e59-5a16-460e-8089-d7b479f49f36-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:47:16.720284 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.720229 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cfmlh\" (UniqueName: \"kubernetes.io/projected/d3be1e59-5a16-460e-8089-d7b479f49f36-kube-api-access-cfmlh\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:47:16.720284 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.720240 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3be1e59-5a16-460e-8089-d7b479f49f36-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:47:16.720284 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:16.720250 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3be1e59-5a16-460e-8089-d7b479f49f36-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:47:17.150208 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.150170 2568 generic.go:358] "Generic (PLEG): container finished" podID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerID="b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5" exitCode=0
Apr 23 18:47:17.150651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.150250 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerDied","Data":"b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5"}
Apr 23 18:47:17.150651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.150294 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"
Apr 23 18:47:17.150651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.150310 2568 scope.go:117] "RemoveContainer" containerID="7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc"
Apr 23 18:47:17.150651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.150295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8" event={"ID":"d3be1e59-5a16-460e-8089-d7b479f49f36","Type":"ContainerDied","Data":"e83db869ce687ecdda479ca8e4e315940f62ede74101dcf6d4fe24de71a4da1e"}
Apr 23 18:47:17.158413 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.158394 2568 scope.go:117] "RemoveContainer" containerID="b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5"
Apr 23 18:47:17.165332 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.165311 2568 scope.go:117] "RemoveContainer" containerID="8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb"
Apr 23 18:47:17.173000 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.172974 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"]
Apr 23 18:47:17.173587 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.173566 2568 scope.go:117] "RemoveContainer" containerID="7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc"
Apr 23 18:47:17.173894 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:47:17.173870 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc\": container with ID starting with 7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc not found: ID does not exist" containerID="7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc"
Apr 23 18:47:17.173975 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.173906 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc"} err="failed to get container status \"7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc\": rpc error: code = NotFound desc = could not find container \"7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc\": container with ID starting with 7d6b7f88fd337a46c0b75b9d651220152de14a6994c75512c06be53b2309b4fc not found: ID does not exist"
Apr 23 18:47:17.173975 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.173954 2568 scope.go:117] "RemoveContainer" containerID="b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5"
Apr 23 18:47:17.174271 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:47:17.174248 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5\": container with ID starting with b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5 not found: ID does not exist" containerID="b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5"
Apr 23 18:47:17.174335 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.174279 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5"} err="failed to get container status \"b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5\": rpc error: code = NotFound desc = could not find container \"b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5\": container with ID starting with b42ccaa05b2e5896fcccdbae64b12595a3c59fea61927126de03dff5c625b9a5 not found: ID does not exist"
Apr 23 18:47:17.174335 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.174297 2568 scope.go:117] "RemoveContainer" containerID="8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb"
Apr 23 18:47:17.174578 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:47:17.174555 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb\": container with ID starting with 8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb not found: ID does not exist" containerID="8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb"
Apr 23 18:47:17.174629 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.174586 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb"} err="failed to get container status \"8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb\": rpc error: code = NotFound desc = could not find container \"8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb\": container with ID starting with 8192abb61cd2478b10194fcaf10e80040a24af7b34e638e1161b019a9d75cabb not found: ID does not exist"
Apr 23 18:47:17.178317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:17.178296 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ndll8"]
Apr 23 18:47:18.598902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:18.598867 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" path="/var/lib/kubelet/pods/d3be1e59-5a16-460e-8089-d7b479f49f36/volumes"
Apr 23 18:47:22.156805 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:22.156770 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:22.727216 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:22.727182 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:47:22.731495 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:22.731474 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:47:22.731633 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:22.731517 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:47:22.734902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:22.734879 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:47:52.161025 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:52.160993 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:47:59.912966 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:59.912910 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"]
Apr 23 18:47:59.915387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:59.913274 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kserve-container" containerID="cri-o://789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475" gracePeriod=30
Apr 23 18:47:59.915387 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:47:59.913339 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kube-rbac-proxy" containerID="cri-o://ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92" gracePeriod=30
Apr 23 18:48:00.026744 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.026709 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"]
Apr 23 18:48:00.027104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027086 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kube-rbac-proxy"
Apr 23 18:48:00.027104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027103 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kube-rbac-proxy"
Apr 23 18:48:00.027292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027115 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="storage-initializer"
Apr 23 18:48:00.027292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027121 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="storage-initializer"
Apr 23 18:48:00.027292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027135 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kserve-container"
Apr 23 18:48:00.027292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027140 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kserve-container"
Apr 23 18:48:00.027292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027191 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kserve-container"
Apr 23 18:48:00.027292 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.027200 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3be1e59-5a16-460e-8089-d7b479f49f36" containerName="kube-rbac-proxy"
Apr 23 18:48:00.030082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.030061 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.031895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.031876 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\""
Apr 23 18:48:00.031992 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.031971 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:48:00.038529 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.038504 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"]
Apr 23 18:48:00.162356 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.162329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phxdn\" (UniqueName: \"kubernetes.io/projected/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kube-api-access-phxdn\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.162539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.162368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.162539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.162403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.162539 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.162467 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.263244 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.263162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phxdn\" (UniqueName: \"kubernetes.io/projected/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kube-api-access-phxdn\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.263244 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.263199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.263463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.263243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.263463 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.263315 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.263717 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.263691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.264166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.264138 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.265373 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.265348 2568 generic.go:358] "Generic (PLEG): container finished" podID="be601995-977f-4504-94c2-9e6c12819b52" containerID="ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92" exitCode=2
Apr 23 18:48:00.265467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.265419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerDied","Data":"ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92"}
Apr 23 18:48:00.265790 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.265773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.272549 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.272524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phxdn\" (UniqueName: \"kubernetes.io/projected/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kube-api-access-phxdn\") pod \"isvc-xgboost-runtime-predictor-779db84d9-hlsdw\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.341275 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.341231 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:00.464493 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:00.464456 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"]
Apr 23 18:48:00.467576 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:48:00.467541 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30a8e3a_8bcf_4e78_975f_c3e20f65dd60.slice/crio-6f1f81ee018fb0ca91f12ff0de3fbd6ded9663fe28172d0b6067e5e331600c8e WatchSource:0}: Error finding container 6f1f81ee018fb0ca91f12ff0de3fbd6ded9663fe28172d0b6067e5e331600c8e: Status 404 returned error can't find the container with id 6f1f81ee018fb0ca91f12ff0de3fbd6ded9663fe28172d0b6067e5e331600c8e
Apr 23 18:48:01.270029 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:01.269989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerStarted","Data":"4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776"}
Apr 23 18:48:01.270029 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:01.270031 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerStarted","Data":"6f1f81ee018fb0ca91f12ff0de3fbd6ded9663fe28172d0b6067e5e331600c8e"}
Apr 23 18:48:02.151126 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:02.151083 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused"
Apr 23 18:48:05.283520 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:05.283489 2568 generic.go:358] "Generic (PLEG): container finished" podID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerID="4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776" exitCode=0
Apr 23 18:48:05.283917 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:05.283562 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerDied","Data":"4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776"}
Apr 23 18:48:06.289039 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.289001 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerStarted","Data":"afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773"}
Apr 23 18:48:06.289039 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.289037 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerStarted","Data":"cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace"}
Apr 23 18:48:06.289479 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.289313 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:06.289479 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.289432 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:06.290664 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.290638 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:48:06.308914 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.308872 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podStartSLOduration=6.308858212 podStartE2EDuration="6.308858212s" podCreationTimestamp="2026-04-23 18:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:48:06.308819333 +0000 UTC m=+3344.228911822" watchObservedRunningTime="2026-04-23 18:48:06.308858212 +0000 UTC m=+3344.228950668"
Apr 23 18:48:06.558327 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.558300 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:48:06.709232 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.709197 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrvx\" (UniqueName: \"kubernetes.io/projected/be601995-977f-4504-94c2-9e6c12819b52-kube-api-access-xzrvx\") pod \"be601995-977f-4504-94c2-9e6c12819b52\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") "
Apr 23 18:48:06.709653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.709286 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be601995-977f-4504-94c2-9e6c12819b52-proxy-tls\") pod \"be601995-977f-4504-94c2-9e6c12819b52\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") "
Apr 23 18:48:06.709653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.709319 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be601995-977f-4504-94c2-9e6c12819b52-kserve-provision-location\") pod \"be601995-977f-4504-94c2-9e6c12819b52\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") "
Apr 23 18:48:06.709653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.709358 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be601995-977f-4504-94c2-9e6c12819b52-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"be601995-977f-4504-94c2-9e6c12819b52\" (UID: \"be601995-977f-4504-94c2-9e6c12819b52\") "
Apr 23 18:48:06.709854 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.709652 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be601995-977f-4504-94c2-9e6c12819b52-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "be601995-977f-4504-94c2-9e6c12819b52" (UID: "be601995-977f-4504-94c2-9e6c12819b52"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:48:06.709854 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.709758 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be601995-977f-4504-94c2-9e6c12819b52-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "be601995-977f-4504-94c2-9e6c12819b52" (UID: "be601995-977f-4504-94c2-9e6c12819b52"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:48:06.711400 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.711379 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be601995-977f-4504-94c2-9e6c12819b52-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "be601995-977f-4504-94c2-9e6c12819b52" (UID: "be601995-977f-4504-94c2-9e6c12819b52"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:48:06.711480 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.711435 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be601995-977f-4504-94c2-9e6c12819b52-kube-api-access-xzrvx" (OuterVolumeSpecName: "kube-api-access-xzrvx") pod "be601995-977f-4504-94c2-9e6c12819b52" (UID: "be601995-977f-4504-94c2-9e6c12819b52"). InnerVolumeSpecName "kube-api-access-xzrvx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:48:06.810300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.810206 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzrvx\" (UniqueName: \"kubernetes.io/projected/be601995-977f-4504-94c2-9e6c12819b52-kube-api-access-xzrvx\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:48:06.810300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.810241 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be601995-977f-4504-94c2-9e6c12819b52-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:48:06.810300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.810251 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be601995-977f-4504-94c2-9e6c12819b52-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:48:06.810300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:06.810262 2568 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be601995-977f-4504-94c2-9e6c12819b52-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:48:07.293862 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.293824 2568 generic.go:358] "Generic (PLEG): container finished" podID="be601995-977f-4504-94c2-9e6c12819b52" containerID="789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475" exitCode=0
Apr 23 18:48:07.294310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.293907 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"
Apr 23 18:48:07.294310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.293958 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerDied","Data":"789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475"}
Apr 23 18:48:07.294310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.294000 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs" event={"ID":"be601995-977f-4504-94c2-9e6c12819b52","Type":"ContainerDied","Data":"70d96f6d9e657e1c45f12e54f0b897eabd56d3d1fffb11f5131c0b27cce0a88a"}
Apr 23 18:48:07.294310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.294015 2568 scope.go:117] "RemoveContainer" containerID="ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92"
Apr 23 18:48:07.294533 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.294498 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:48:07.301858 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.301838 2568 scope.go:117] "RemoveContainer" containerID="789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475"
Apr 23 18:48:07.308624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.308606 2568 scope.go:117] "RemoveContainer" containerID="c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10"
Apr 23 18:48:07.315468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.315447 2568 scope.go:117] "RemoveContainer" containerID="ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92"
Apr 23 18:48:07.315557 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.315528 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"]
Apr 23 18:48:07.315771 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:48:07.315752 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92\": container with ID starting with ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92 not found: ID does not exist" containerID="ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92"
Apr 23 18:48:07.315836 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.315778 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92"} err="failed to get container status \"ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92\": rpc error: code = NotFound desc = could not find container \"ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92\": container with ID starting with ba69ede76a1e82f88022fef1a5c27b21109240607ab0db083326c2d7079e3d92 not found: ID does not exist"
Apr 23 18:48:07.315836 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.315801 2568 scope.go:117] "RemoveContainer" containerID="789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475"
Apr 23 18:48:07.316035 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:48:07.316016 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475\": container with ID starting with 789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475 not found: ID does not exist" containerID="789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475"
Apr 23 18:48:07.316090 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.316039 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475"} err="failed to get container status \"789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475\": rpc error: code = NotFound desc = could not find container \"789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475\": container with ID starting with 789fd4177143e1c229ee0773418861e09f0f17d2f94c2c4f2105772662df8475 not found: ID does not exist"
Apr 23 18:48:07.316090 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.316051 2568 scope.go:117] "RemoveContainer" containerID="c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10"
Apr 23 18:48:07.316258 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:48:07.316237 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10\": container with ID starting with c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10 not found: ID does not exist" containerID="c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10"
Apr 23 18:48:07.316316 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.316261 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10"} err="failed to get container status \"c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10\": rpc error: code = NotFound desc = could not find container \"c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10\": container with ID starting with c2ef0ab1c84c898f6eca58801b928191b0a034914bf0368e3dcd0f55993acc10 not found: ID does not exist"
Apr 23 18:48:07.321011 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:07.320990 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-6ccvs"]
Apr 23 18:48:08.599143 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:08.599112 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be601995-977f-4504-94c2-9e6c12819b52" path="/var/lib/kubelet/pods/be601995-977f-4504-94c2-9e6c12819b52/volumes"
Apr 23 18:48:12.299179 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:12.299149 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:48:12.299713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:12.299687 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:48:22.300508 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:22.300462 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:48:32.299673 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:32.299628 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:48:42.299874 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:42.299829 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:48:52.300296 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:48:52.300255 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:49:02.300621 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:02.300541 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:49:12.300623 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:12.300593 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"
Apr 23 18:49:20.122480 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.122440 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"]
Apr 23 18:49:20.122867 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.122783 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" containerID="cri-o://cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace" gracePeriod=30
Apr 23 18:49:20.122923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.122818 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kube-rbac-proxy" containerID="cri-o://afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773" gracePeriod=30
Apr 23 18:49:20.220456 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220424 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"]
Apr 23 18:49:20.220705 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220693 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kserve-container"
Apr 23 18:49:20.220751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220706 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kserve-container"
Apr 23 18:49:20.220751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220718 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kube-rbac-proxy"
Apr 23 18:49:20.220751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220724 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kube-rbac-proxy"
Apr 23 18:49:20.220751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220741 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="storage-initializer"
Apr 23 18:49:20.220751 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220747 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="storage-initializer"
Apr 23 18:49:20.220906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220786 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kserve-container"
Apr 23 18:49:20.220906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.220796 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="be601995-977f-4504-94c2-9e6c12819b52" containerName="kube-rbac-proxy"
Apr 23 18:49:20.223534 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.223513 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.225372 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.225347 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\""
Apr 23 18:49:20.225499 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.225351 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:49:20.237135 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.237113 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"]
Apr 23 18:49:20.343136 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.343104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.343317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.343149 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5895e4ad-7504-4651-9025-c435cdda3ec6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.343317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.343209 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjhq\" (UniqueName: \"kubernetes.io/projected/5895e4ad-7504-4651-9025-c435cdda3ec6-kube-api-access-qzjhq\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.343317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.343228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5895e4ad-7504-4651-9025-c435cdda3ec6-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.443632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.443542 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5895e4ad-7504-4651-9025-c435cdda3ec6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.443632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.443597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjhq\" (UniqueName: \"kubernetes.io/projected/5895e4ad-7504-4651-9025-c435cdda3ec6-kube-api-access-qzjhq\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.443632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.443621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5895e4ad-7504-4651-9025-c435cdda3ec6-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.443909 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.443674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.443909 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:49:20.443785 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found
Apr 23 18:49:20.443909 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:49:20.443847 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls podName:5895e4ad-7504-4651-9025-c435cdda3ec6 nodeName:}" failed. No retries permitted until 2026-04-23 18:49:20.94382757 +0000 UTC m=+3418.863920024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" (UID: "5895e4ad-7504-4651-9025-c435cdda3ec6") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found
Apr 23 18:49:20.448842 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.444415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5895e4ad-7504-4651-9025-c435cdda3ec6-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.448842 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.444555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5895e4ad-7504-4651-9025-c435cdda3ec6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.452528 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.452503 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjhq\" (UniqueName: \"kubernetes.io/projected/5895e4ad-7504-4651-9025-c435cdda3ec6-kube-api-access-qzjhq\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.499939 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.499896 2568 generic.go:358] "Generic (PLEG): container finished" podID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerID="afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773" exitCode=2
Apr 23 18:49:20.500102 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.499960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerDied","Data":"afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773"}
Apr 23 18:49:20.948555 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.948520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:20.951042 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:20.951018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:21.133659 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:21.133609 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"
Apr 23 18:49:21.258425 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:21.258390 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"]
Apr 23 18:49:21.261887 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:49:21.261849 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5895e4ad_7504_4651_9025_c435cdda3ec6.slice/crio-534d54a6fe8e92b0f525ff292ab8bb528be92eefe96fcad09a28b779beb0a58a WatchSource:0}: Error finding container 534d54a6fe8e92b0f525ff292ab8bb528be92eefe96fcad09a28b779beb0a58a: Status 404 returned error can't find the container with id 534d54a6fe8e92b0f525ff292ab8bb528be92eefe96fcad09a28b779beb0a58a
Apr 23 18:49:21.503920 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:21.503834 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerStarted","Data":"00960d9625d38765610bf6d45c892c5ba64d33b1390bfc884ba7b018121e674c"}
Apr 23 18:49:21.503920 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:21.503875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerStarted","Data":"534d54a6fe8e92b0f525ff292ab8bb528be92eefe96fcad09a28b779beb0a58a"}
Apr 23 18:49:22.294789 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:22.294745 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused"
Apr 23 18:49:22.300136 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:22.300112 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 23 18:49:23.963102 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:23.963074 2568 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" Apr 23 18:49:24.073876 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.073838 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phxdn\" (UniqueName: \"kubernetes.io/projected/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kube-api-access-phxdn\") pod \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " Apr 23 18:49:24.074166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.073901 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kserve-provision-location\") pod \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " Apr 23 18:49:24.074166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.073966 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-proxy-tls\") pod \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " Apr 23 18:49:24.074166 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.074020 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\" (UID: \"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60\") " Apr 23 18:49:24.074332 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.074239 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" (UID: "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:49:24.074372 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.074355 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" (UID: "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:49:24.075987 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.075964 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" (UID: "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:49:24.076050 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.076013 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kube-api-access-phxdn" (OuterVolumeSpecName: "kube-api-access-phxdn") pod "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" (UID: "d30a8e3a-8bcf-4e78-975f-c3e20f65dd60"). 
InnerVolumeSpecName "kube-api-access-phxdn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:49:24.175604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.175555 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:49:24.175604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.175599 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phxdn\" (UniqueName: \"kubernetes.io/projected/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kube-api-access-phxdn\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:49:24.175604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.175609 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:49:24.175845 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.175620 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:49:24.513870 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.513777 2568 generic.go:358] "Generic (PLEG): container finished" podID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerID="cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace" exitCode=0 Apr 23 18:49:24.513870 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.513820 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerDied","Data":"cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace"} Apr 23 18:49:24.513870 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.513869 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" Apr 23 18:49:24.514134 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.513884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw" event={"ID":"d30a8e3a-8bcf-4e78-975f-c3e20f65dd60","Type":"ContainerDied","Data":"6f1f81ee018fb0ca91f12ff0de3fbd6ded9663fe28172d0b6067e5e331600c8e"} Apr 23 18:49:24.514134 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.513901 2568 scope.go:117] "RemoveContainer" containerID="afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773" Apr 23 18:49:24.522058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.522033 2568 scope.go:117] "RemoveContainer" containerID="cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace" Apr 23 18:49:24.531196 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.531170 2568 scope.go:117] "RemoveContainer" containerID="4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776" Apr 23 18:49:24.536868 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.536839 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"] Apr 23 18:49:24.539651 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.539632 2568 scope.go:117] "RemoveContainer" containerID="afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773" Apr 23 18:49:24.540007 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:49:24.539987 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773\": container with ID starting with afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773 not found: ID does not exist" containerID="afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773" Apr 23 18:49:24.540114 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.540013 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773"} err="failed to get container status \"afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773\": rpc error: code = NotFound desc = could not find container \"afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773\": container with ID starting with afa916d0dd9c6062f8aa11f5d43776e4b0bc90f35b6cfab59dd64d64f0a4d773 not found: ID does not exist" Apr 23 18:49:24.540114 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.540034 2568 scope.go:117] "RemoveContainer" containerID="cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace" Apr 23 18:49:24.540757 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:49:24.540736 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace\": container with ID starting with cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace not found: ID does not exist" containerID="cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace" Apr 23 18:49:24.540835 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.540762 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace"} err="failed to get container status 
\"cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace\": rpc error: code = NotFound desc = could not find container \"cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace\": container with ID starting with cee261f7db24d65b540270b36de735ea2013f2e5b5cf402f8c56b739df6b2ace not found: ID does not exist" Apr 23 18:49:24.540835 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.540782 2568 scope.go:117] "RemoveContainer" containerID="4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776" Apr 23 18:49:24.541158 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:49:24.541137 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776\": container with ID starting with 4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776 not found: ID does not exist" containerID="4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776" Apr 23 18:49:24.541243 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.541164 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776"} err="failed to get container status \"4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776\": rpc error: code = NotFound desc = could not find container \"4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776\": container with ID starting with 4ce24c62349283b69193848d6560470a9532b2b0ec13f1be47dccd4116ab0776 not found: ID does not exist" Apr 23 18:49:24.541849 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.541829 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-hlsdw"] Apr 23 18:49:24.598500 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:24.598469 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" path="/var/lib/kubelet/pods/d30a8e3a-8bcf-4e78-975f-c3e20f65dd60/volumes" Apr 23 18:49:25.520006 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:25.519970 2568 generic.go:358] "Generic (PLEG): container finished" podID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerID="00960d9625d38765610bf6d45c892c5ba64d33b1390bfc884ba7b018121e674c" exitCode=0 Apr 23 18:49:25.520449 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:25.520039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerDied","Data":"00960d9625d38765610bf6d45c892c5ba64d33b1390bfc884ba7b018121e674c"} Apr 23 18:49:26.525103 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:26.525067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerStarted","Data":"d7c8db28802bfb012a5bdda951a1a16b0e7c5be116f06955687a232e06568c9a"} Apr 23 18:49:26.525546 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:26.525108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerStarted","Data":"a96a790b8f1f510aa076ed2c69ed21d7ea5813984b8d7d70032ec537146ab808"} Apr 23 18:49:26.525546 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:26.525336 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" Apr 23 18:49:26.545003 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:26.544947 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" podStartSLOduration=6.5449136 podStartE2EDuration="6.5449136s" podCreationTimestamp="2026-04-23 18:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:49:26.543483742 +0000 UTC m=+3424.463576229" watchObservedRunningTime="2026-04-23 18:49:26.5449136 +0000 UTC m=+3424.465006077" Apr 23 18:49:27.528061 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:27.528029 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" Apr 23 18:49:33.537398 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:49:33.537363 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" Apr 23 18:50:03.569343 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:03.569305 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 18:50:13.540030 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:13.540000 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" Apr 23 18:50:20.335628 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.335595 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"] Apr 23 18:50:20.336231 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.335918 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kserve-container" containerID="cri-o://a96a790b8f1f510aa076ed2c69ed21d7ea5813984b8d7d70032ec537146ab808" gracePeriod=30 Apr 23 18:50:20.336231 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.335981 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kube-rbac-proxy" containerID="cri-o://d7c8db28802bfb012a5bdda951a1a16b0e7c5be116f06955687a232e06568c9a" gracePeriod=30 Apr 23 18:50:20.408088 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408052 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"] Apr 23 18:50:20.408339 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408327 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kube-rbac-proxy" Apr 23 18:50:20.408399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408340 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kube-rbac-proxy" Apr 23 18:50:20.408399 ip-10-0-131-144 
Apr 23 18:50:20.408399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408358 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container"
Apr 23 18:50:20.408399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408367 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="storage-initializer"
Apr 23 18:50:20.408399 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408373 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="storage-initializer"
Apr 23 18:50:20.408566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408421 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kserve-container"
Apr 23 18:50:20.408566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.408430 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d30a8e3a-8bcf-4e78-975f-c3e20f65dd60" containerName="kube-rbac-proxy"
Apr 23 18:50:20.411801 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.411781 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.413656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.413633 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\""
Apr 23 18:50:20.413781 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.413644 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 23 18:50:20.421685 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.421660 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"]
Apr 23 18:50:20.478159 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.478129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.478322 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.478166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/640e0835-684a-47eb-b888-ff812cf0805c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.478322 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.478252 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/640e0835-684a-47eb-b888-ff812cf0805c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.478322 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.478287 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdb6m\" (UniqueName: \"kubernetes.io/projected/640e0835-684a-47eb-b888-ff812cf0805c-kube-api-access-tdb6m\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.579468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.579426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.579468 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.579470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/640e0835-684a-47eb-b888-ff812cf0805c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.579701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.579501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/640e0835-684a-47eb-b888-ff812cf0805c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.579701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.579526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdb6m\" (UniqueName: \"kubernetes.io/projected/640e0835-684a-47eb-b888-ff812cf0805c-kube-api-access-tdb6m\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.579701 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:50:20.579593 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found
Apr 23 18:50:20.579701 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:50:20.579672 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls podName:640e0835-684a-47eb-b888-ff812cf0805c nodeName:}" failed. No retries permitted until 2026-04-23 18:50:21.079650194 +0000 UTC m=+3478.999742654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" (UID: "640e0835-684a-47eb-b888-ff812cf0805c") : secret "isvc-xgboost-v2-predictor-serving-cert" not found
Apr 23 18:50:20.579920 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.579903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/640e0835-684a-47eb-b888-ff812cf0805c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.580211 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.580194 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/640e0835-684a-47eb-b888-ff812cf0805c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.588217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.588157 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdb6m\" (UniqueName: \"kubernetes.io/projected/640e0835-684a-47eb-b888-ff812cf0805c-kube-api-access-tdb6m\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:20.669846 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.669815 2568 generic.go:358] "Generic (PLEG): container finished" podID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerID="d7c8db28802bfb012a5bdda951a1a16b0e7c5be116f06955687a232e06568c9a" exitCode=2
Apr 23 18:50:20.670017 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:20.669885 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerDied","Data":"d7c8db28802bfb012a5bdda951a1a16b0e7c5be116f06955687a232e06568c9a"}
Apr 23 18:50:21.083236 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.083205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:21.085572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.085540 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-69cbv\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Apr 23 18:50:21.322504 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.322464 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:50:21.446633 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.446602 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"] Apr 23 18:50:21.449048 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:50:21.449012 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod640e0835_684a_47eb_b888_ff812cf0805c.slice/crio-fb4dd74893a8b7fdbfb253cfc2f8be522c530c21ce06ba5cbbd4f771a9ec54d6 WatchSource:0}: Error finding container fb4dd74893a8b7fdbfb253cfc2f8be522c530c21ce06ba5cbbd4f771a9ec54d6: Status 404 returned error can't find the container with id fb4dd74893a8b7fdbfb253cfc2f8be522c530c21ce06ba5cbbd4f771a9ec54d6 Apr 23 18:50:21.450860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.450844 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:50:21.673924 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.673842 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerStarted","Data":"d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda"} Apr 23 18:50:21.673924 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:21.673882 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerStarted","Data":"fb4dd74893a8b7fdbfb253cfc2f8be522c530c21ce06ba5cbbd4f771a9ec54d6"} Apr 23 18:50:23.532288 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:23.532249 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.54:8643/healthz\": dial tcp 10.133.0.54:8643: connect: connection refused" Apr 23 18:50:25.686943 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:25.686896 2568 generic.go:358] "Generic (PLEG): container finished" podID="640e0835-684a-47eb-b888-ff812cf0805c" containerID="d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda" exitCode=0 Apr 23 18:50:25.687326 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:25.686968 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerDied","Data":"d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda"} Apr 23 18:50:26.691447 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:26.691407 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerStarted","Data":"38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33"} Apr 23 18:50:26.691447 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:26.691446 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerStarted","Data":"6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954"} Apr 23 18:50:26.691917 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:50:26.691645 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:50:26.710582 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:26.710536 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podStartSLOduration=6.710522514 podStartE2EDuration="6.710522514s" podCreationTimestamp="2026-04-23 18:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:50:26.70908688 +0000 UTC m=+3484.629179356" watchObservedRunningTime="2026-04-23 18:50:26.710522514 +0000 UTC m=+3484.630614988" Apr 23 18:50:27.696040 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.696002 2568 generic.go:358] "Generic (PLEG): container finished" podID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerID="a96a790b8f1f510aa076ed2c69ed21d7ea5813984b8d7d70032ec537146ab808" exitCode=0 Apr 23 18:50:27.696409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.696084 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerDied","Data":"a96a790b8f1f510aa076ed2c69ed21d7ea5813984b8d7d70032ec537146ab808"} Apr 23 18:50:27.696476 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.696460 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:50:27.697627 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.697601 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:50:27.783743 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.783718 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" Apr 23 18:50:27.834597 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.834572 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzjhq\" (UniqueName: \"kubernetes.io/projected/5895e4ad-7504-4651-9025-c435cdda3ec6-kube-api-access-qzjhq\") pod \"5895e4ad-7504-4651-9025-c435cdda3ec6\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " Apr 23 18:50:27.834759 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.834616 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls\") pod \"5895e4ad-7504-4651-9025-c435cdda3ec6\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " Apr 23 18:50:27.834759 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.834658 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5895e4ad-7504-4651-9025-c435cdda3ec6-kserve-provision-location\") pod \"5895e4ad-7504-4651-9025-c435cdda3ec6\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " Apr 23 18:50:27.834759 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.834694 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5895e4ad-7504-4651-9025-c435cdda3ec6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"5895e4ad-7504-4651-9025-c435cdda3ec6\" (UID: \"5895e4ad-7504-4651-9025-c435cdda3ec6\") " Apr 23 18:50:27.835063 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.835038 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5895e4ad-7504-4651-9025-c435cdda3ec6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5895e4ad-7504-4651-9025-c435cdda3ec6" (UID: "5895e4ad-7504-4651-9025-c435cdda3ec6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:50:27.835123 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.835085 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5895e4ad-7504-4651-9025-c435cdda3ec6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "5895e4ad-7504-4651-9025-c435cdda3ec6" (UID: "5895e4ad-7504-4651-9025-c435cdda3ec6"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:50:27.836660 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.836641 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5895e4ad-7504-4651-9025-c435cdda3ec6-kube-api-access-qzjhq" (OuterVolumeSpecName: "kube-api-access-qzjhq") pod "5895e4ad-7504-4651-9025-c435cdda3ec6" (UID: "5895e4ad-7504-4651-9025-c435cdda3ec6"). InnerVolumeSpecName "kube-api-access-qzjhq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:50:27.836726 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.836677 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5895e4ad-7504-4651-9025-c435cdda3ec6" (UID: "5895e4ad-7504-4651-9025-c435cdda3ec6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:50:27.935766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.935729 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzjhq\" (UniqueName: \"kubernetes.io/projected/5895e4ad-7504-4651-9025-c435cdda3ec6-kube-api-access-qzjhq\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:50:27.935766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.935758 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5895e4ad-7504-4651-9025-c435cdda3ec6-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:50:27.935766 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.935770 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5895e4ad-7504-4651-9025-c435cdda3ec6-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:50:27.936055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:27.935780 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5895e4ad-7504-4651-9025-c435cdda3ec6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:50:28.700388 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.700355 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" Apr 23 18:50:28.700388 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.700355 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb" event={"ID":"5895e4ad-7504-4651-9025-c435cdda3ec6","Type":"ContainerDied","Data":"534d54a6fe8e92b0f525ff292ab8bb528be92eefe96fcad09a28b779beb0a58a"} Apr 23 18:50:28.700874 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.700411 2568 scope.go:117] "RemoveContainer" containerID="d7c8db28802bfb012a5bdda951a1a16b0e7c5be116f06955687a232e06568c9a" Apr 23 18:50:28.700874 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.700636 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:50:28.708249 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.708223 2568 scope.go:117] "RemoveContainer" containerID="a96a790b8f1f510aa076ed2c69ed21d7ea5813984b8d7d70032ec537146ab808" Apr 23 18:50:28.715265 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.715243 2568 scope.go:117] "RemoveContainer" containerID="00960d9625d38765610bf6d45c892c5ba64d33b1390bfc884ba7b018121e674c" Apr 23 18:50:28.718829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.718806 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"] Apr 23 18:50:28.724535 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:28.724512 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-h5txb"] Apr 23 18:50:30.601544 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:30.601515 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" path="/var/lib/kubelet/pods/5895e4ad-7504-4651-9025-c435cdda3ec6/volumes" Apr 23 18:50:33.705097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:33.705018 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:50:33.705613 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:33.705587 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:50:43.706428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:43.706388 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:50:53.705685 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:50:53.705644 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:51:03.705527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:03.705487 2568 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:51:13.705993 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:13.705953 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:51:23.705573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:23.705528 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 18:51:33.706610 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:33.706578 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:51:40.551035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.551005 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"] Apr 23 18:51:40.551497 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.551383 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" containerID="cri-o://6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954" gracePeriod=30 Apr 23 18:51:40.551576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.551506 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kube-rbac-proxy" containerID="cri-o://38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33" gracePeriod=30 Apr 23 18:51:40.649493 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649463 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"] Apr 23 18:51:40.649746 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649734 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kube-rbac-proxy" Apr 23 18:51:40.649800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649748 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kube-rbac-proxy" Apr 23 18:51:40.649800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649765 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="storage-initializer" Apr 23 18:51:40.649800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649771 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="storage-initializer" Apr 23 18:51:40.649800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649780 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kserve-container" Apr 23 18:51:40.649800 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649785 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kserve-container" Apr 23 18:51:40.649986 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649828 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kube-rbac-proxy" Apr 23 18:51:40.649986 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.649838 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5895e4ad-7504-4651-9025-c435cdda3ec6" containerName="kserve-container" Apr 23 18:51:40.652710 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.652690 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.654576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.654550 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 23 18:51:40.654576 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.654568 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 23 18:51:40.654747 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.654628 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 23 18:51:40.662707 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.662687 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"] Apr 23 18:51:40.680807 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.680783 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.680950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.680824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.680950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.680850 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mth27\" (UniqueName: \"kubernetes.io/projected/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kube-api-access-mth27\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.680950 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.680906 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.782271 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.782236 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.782457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.782284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mth27\" (UniqueName: \"kubernetes.io/projected/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kube-api-access-mth27\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.782457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.782338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.782457 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.782381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.782671 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:51:40.782496 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 23 18:51:40.782671 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:51:40.782565 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls podName:f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd nodeName:}" failed. No retries permitted until 2026-04-23 18:51:41.282541758 +0000 UTC m=+3559.202634212 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls") pod "isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" (UID: "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 23 18:51:40.782671 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.782631 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.783099 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.783077 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.793264 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.793238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mth27\" (UniqueName: \"kubernetes.io/projected/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kube-api-access-mth27\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:40.904225 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.904196 2568 generic.go:358] "Generic (PLEG): container finished" podID="640e0835-684a-47eb-b888-ff812cf0805c" containerID="38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33" exitCode=2 Apr 23 18:51:40.904404 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:40.904231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerDied","Data":"38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33"} Apr 23 18:51:41.287548 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.287451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:41.289832 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.289802 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls\") pod \"isvc-sklearn-s3-predictor-569fbb67c4-rvlgm\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:41.563473 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.563389 2568 util.go:30] "No sandbox for pod can be found. 
Apr 23 18:51:41.563473 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.563389 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"
Apr 23 18:51:41.686663 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.686627 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"]
Apr 23 18:51:41.690625 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:51:41.690582 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a54c88_4e7b_4dc0_b90e_b67c46e25fbd.slice/crio-8f83592f9367c1b6674f2292f17059f500845da9c382957a07d2a44ed4c1905e WatchSource:0}: Error finding container 8f83592f9367c1b6674f2292f17059f500845da9c382957a07d2a44ed4c1905e: Status 404 returned error can't find the container with id 8f83592f9367c1b6674f2292f17059f500845da9c382957a07d2a44ed4c1905e
Apr 23 18:51:41.908451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.908418 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerStarted","Data":"3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d"}
Apr 23 18:51:41.908451 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:41.908454 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerStarted","Data":"8f83592f9367c1b6674f2292f17059f500845da9c382957a07d2a44ed4c1905e"}
Apr 23 18:51:42.912917 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:42.912882 2568 generic.go:358] "Generic (PLEG): container finished" podID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerID="3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d" exitCode=0
Apr 23 18:51:42.913411 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:42.912955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerDied","Data":"3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d"}
Apr 23 18:51:43.701074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:43.701031 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.55:8643/healthz\": dial tcp 10.133.0.55:8643: connect: connection refused"
Apr 23 18:51:43.706479 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:43.706450 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 23 18:51:43.917192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:43.917161 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerStarted","Data":"297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2"}
Apr 23 18:51:43.917192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:43.917195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"
event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerStarted","Data":"dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf"} Apr 23 18:51:43.917593 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:43.917287 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:43.939882 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:43.939842 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podStartSLOduration=3.9398266509999997 podStartE2EDuration="3.939826651s" podCreationTimestamp="2026-04-23 18:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:51:43.938618809 +0000 UTC m=+3561.858711283" watchObservedRunningTime="2026-04-23 18:51:43.939826651 +0000 UTC m=+3561.859919191" Apr 23 18:51:44.190921 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.190891 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:51:44.207572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.207540 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/640e0835-684a-47eb-b888-ff812cf0805c-kserve-provision-location\") pod \"640e0835-684a-47eb-b888-ff812cf0805c\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " Apr 23 18:51:44.207713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.207595 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdb6m\" (UniqueName: \"kubernetes.io/projected/640e0835-684a-47eb-b888-ff812cf0805c-kube-api-access-tdb6m\") pod \"640e0835-684a-47eb-b888-ff812cf0805c\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " Apr 23 18:51:44.207713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.207666 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls\") pod \"640e0835-684a-47eb-b888-ff812cf0805c\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " Apr 23 18:51:44.207713 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.207700 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/640e0835-684a-47eb-b888-ff812cf0805c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"640e0835-684a-47eb-b888-ff812cf0805c\" (UID: \"640e0835-684a-47eb-b888-ff812cf0805c\") " Apr 23 18:51:44.207947 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.207912 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640e0835-684a-47eb-b888-ff812cf0805c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "640e0835-684a-47eb-b888-ff812cf0805c" (UID: "640e0835-684a-47eb-b888-ff812cf0805c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:51:44.208162 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.208138 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/640e0835-684a-47eb-b888-ff812cf0805c-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "640e0835-684a-47eb-b888-ff812cf0805c" (UID: "640e0835-684a-47eb-b888-ff812cf0805c"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:51:44.209824 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.209796 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "640e0835-684a-47eb-b888-ff812cf0805c" (UID: "640e0835-684a-47eb-b888-ff812cf0805c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:51:44.209886 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.209848 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640e0835-684a-47eb-b888-ff812cf0805c-kube-api-access-tdb6m" (OuterVolumeSpecName: "kube-api-access-tdb6m") pod "640e0835-684a-47eb-b888-ff812cf0805c" (UID: "640e0835-684a-47eb-b888-ff812cf0805c"). InnerVolumeSpecName "kube-api-access-tdb6m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:51:44.309106 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.309029 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/640e0835-684a-47eb-b888-ff812cf0805c-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:51:44.309106 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.309054 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/640e0835-684a-47eb-b888-ff812cf0805c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:51:44.309106 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.309064 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/640e0835-684a-47eb-b888-ff812cf0805c-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:51:44.309106 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.309073 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdb6m\" (UniqueName: \"kubernetes.io/projected/640e0835-684a-47eb-b888-ff812cf0805c-kube-api-access-tdb6m\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:51:44.921630 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.921540 2568 generic.go:358] "Generic (PLEG): container finished" podID="640e0835-684a-47eb-b888-ff812cf0805c" containerID="6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954" exitCode=0 Apr 23 18:51:44.922051 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.921628 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" Apr 23 18:51:44.922051 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.921627 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerDied","Data":"6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954"} Apr 23 18:51:44.922051 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.921669 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv" event={"ID":"640e0835-684a-47eb-b888-ff812cf0805c","Type":"ContainerDied","Data":"fb4dd74893a8b7fdbfb253cfc2f8be522c530c21ce06ba5cbbd4f771a9ec54d6"} Apr 23 18:51:44.922051 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.921689 2568 scope.go:117] "RemoveContainer" containerID="38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33" Apr 23 18:51:44.922328 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.922308 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" Apr 23 18:51:44.923854 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.923827 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 23 18:51:44.929376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.929358 2568 scope.go:117] "RemoveContainer" containerID="6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954" Apr 23 18:51:44.936007 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.935990 2568 scope.go:117] "RemoveContainer" containerID="d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda" Apr 23 18:51:44.946048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.945687 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"] Apr 23 18:51:44.946211 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.946194 2568 scope.go:117] "RemoveContainer" containerID="38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33" Apr 23 18:51:44.946590 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:51:44.946568 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33\": container with ID starting with 38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33 not found: ID does not exist" containerID="38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33" Apr 23 18:51:44.946671 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.946601 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33"} err="failed to get container status \"38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33\": rpc error: code = NotFound desc = could not find container \"38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33\": container with ID starting with 38208a55116402f53f81e51d8abaa9410dd6c6f369d7321ccd98d8f062175e33 not found: ID does not exist" Apr 23 18:51:44.946671 ip-10-0-131-144 kubenswrapper[2568]: 
I0423 18:51:44.946626 2568 scope.go:117] "RemoveContainer" containerID="6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954"
Apr 23 18:51:44.947096 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:51:44.947056 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954\": container with ID starting with 6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954 not found: ID does not exist" containerID="6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954"
Apr 23 18:51:44.947211 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.947105 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954"} err="failed to get container status \"6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954\": rpc error: code = NotFound desc = could not find container \"6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954\": container with ID starting with 6c9b23d396883f956af77acc4d5a9bc591853442aeea0a650c7dd0a2d177f954 not found: ID does not exist"
Apr 23 18:51:44.947211 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.947129 2568 scope.go:117] "RemoveContainer" containerID="d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda"
Apr 23 18:51:44.947455 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:51:44.947428 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda\": container with ID starting with d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda not found: ID does not exist" containerID="d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda"
Apr 23 18:51:44.947560 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.947461 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda"} err="failed to get container status \"d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda\": rpc error: code = NotFound desc = could not find container \"d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda\": container with ID starting with d30817271f7108d9b2c71c5b127d5f91c65888333e964cbd8f8665e6a69f1dda not found: ID does not exist"
Apr 23 18:51:44.948142 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:44.948122 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-69cbv"]
Apr 23 18:51:45.925856 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:45.925819 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:51:46.598838 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:46.598804 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640e0835-684a-47eb-b888-ff812cf0805c" path="/var/lib/kubelet/pods/640e0835-684a-47eb-b888-ff812cf0805c/volumes"
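Note: the RemoveContainer / NotFound pairs above are a benign race during pod deletion, not a real failure: the kubelet asks the runtime to delete containers CRI-O has already removed, and a NotFound answer is treated as "already gone". A small Go sketch of that idempotent-delete pattern; the map-backed removeContainer below is a hypothetical stand-in, not the real CRI client:

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for a gRPC NotFound status from the runtime.
var errNotFound = errors.New("NotFound: ID does not exist")

// removeContainer stands in for the CRI RemoveContainer call.
func removeContainer(id string, present map[string]bool) error {
	if !present[id] {
		return errNotFound
	}
	delete(present, id)
	return nil
}

func main() {
	present := map[string]bool{"38208a55": true} // only one ID still known to the runtime
	for _, id := range []string{"38208a55", "6c9b23d3", "d3081727"} {
		err := removeContainer(id, present)
		if errors.Is(err, errNotFound) {
			// Deletion is idempotent: "already gone" counts as success and is only logged.
			fmt.Printf("DeleteContainer returned error for %s: %v (ignored)\n", id, err)
			continue
		}
		if err != nil {
			fmt.Printf("removal of %s failed, will retry: %v\n", id, err)
			continue
		}
		fmt.Printf("removed container %s\n", id)
	}
}

This is why the E-level "ContainerStatus from runtime service failed" lines are immediately followed by a normal "SyncLoop REMOVE" and orphaned-volume cleanup rather than a retry loop.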
Apr 23 18:51:50.930505 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:50.930471 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"
Apr 23 18:51:50.931013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:51:50.930988 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:52:00.931834 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:00.931796 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:52:10.931796 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:10.931758 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:52:20.931718 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:20.931673 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:52:22.748313 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:22.748221 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:52:22.751706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:22.751683 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:52:22.751949 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:22.751907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:52:22.755349 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:22.755328 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:52:30.931179 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:30.931140 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:52:40.931586 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:40.931549 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
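Note: the ten-second cadence above is the kubelet's readiness prober re-dialing the kserve-container port until the model server finally listens; each refusal only keeps the container not-ready, it never restarts anything (that would be a liveness probe). A minimal sketch of such a TCP readiness check; the address and period are copied from the log for flavor, and probeOnce is an assumption, not the kubelet's prober:

package main

import (
	"fmt"
	"net"
	"time"
)

// probeOnce dials the container port the way a tcpSocket readiness probe does;
// "connect: connection refused" means nothing is listening yet.
func probeOnce(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	addr := "10.133.0.56:8080" // pod IP and port from the log (illustrative here)
	period := 10 * time.Second // matches the ~10s probe cadence above
	for attempt := 1; attempt <= 12; attempt++ {
		if err := probeOnce(addr, time.Second); err != nil {
			fmt.Printf("Probe failed: probeResult=failure output=%q\n", err.Error())
			time.Sleep(period)
			continue
		}
		fmt.Println("probe succeeded; container can be marked ready")
		return
	}
	fmt.Println("giving up after 12 attempts")
}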
Apr 23 18:52:50.932047 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:52:50.932015 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"
Apr 23 18:53:00.799278 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.799235 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"]
Apr 23 18:53:00.801724 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.799667 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" containerID="cri-o://dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf" gracePeriod=30
Apr 23 18:53:00.801724 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.799701 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kube-rbac-proxy" containerID="cri-o://297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2" gracePeriod=30
Apr 23 18:53:00.926686 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.926645 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.56:8643/healthz\": dial tcp 10.133.0.56:8643: connect: connection refused"
Apr 23 18:53:00.931632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.931595 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 23 18:53:00.989792 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.989751 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"]
Apr 23 18:53:00.990097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990077 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kube-rbac-proxy"
Apr 23 18:53:00.990097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990098 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kube-rbac-proxy"
Apr 23 18:53:00.990097 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990112 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container"
Apr 23 18:53:00.990317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990120 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container"
Apr 23 18:53:00.990317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990136 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="storage-initializer"
Apr 23 18:53:00.990317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990144 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="storage-initializer"
Apr 23 18:53:00.990317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990201 2568
memory_manager.go:356] "RemoveStaleState removing state" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kserve-container" Apr 23 18:53:00.990317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.990215 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="640e0835-684a-47eb-b888-ff812cf0805c" containerName="kube-rbac-proxy" Apr 23 18:53:00.993192 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.993167 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:00.995002 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.994975 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 23 18:53:00.995125 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.995040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 23 18:53:00.995125 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:00.995040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:53:01.003114 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.003056 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"] Apr 23 18:53:01.080390 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.080351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.080599 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.080397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b94ea5b-5812-442e-b351-02b01e3347b3-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.080599 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.080417 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktkh2\" (UniqueName: \"kubernetes.io/projected/1b94ea5b-5812-442e-b351-02b01e3347b3-kube-api-access-ktkh2\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.080599 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.080531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.080599 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.080578 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b94ea5b-5812-442e-b351-02b01e3347b3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.135640 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.135607 2568 generic.go:358] "Generic (PLEG): container finished" podID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerID="297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2" exitCode=2 Apr 23 18:53:01.135819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.135682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerDied","Data":"297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2"} Apr 23 18:53:01.181827 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.181788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b94ea5b-5812-442e-b351-02b01e3347b3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182043 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.181850 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182043 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.181871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b94ea5b-5812-442e-b351-02b01e3347b3-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182043 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.181892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktkh2\" (UniqueName: \"kubernetes.io/projected/1b94ea5b-5812-442e-b351-02b01e3347b3-kube-api-access-ktkh2\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182043 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.181942 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182287 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.182261 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b94ea5b-5812-442e-b351-02b01e3347b3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182545 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.182511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.182669 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.182584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.184360 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.184342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b94ea5b-5812-442e-b351-02b01e3347b3-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.190704 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.190679 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktkh2\" (UniqueName: \"kubernetes.io/projected/1b94ea5b-5812-442e-b351-02b01e3347b3-kube-api-access-ktkh2\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:01.305804 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.305760 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"
Apr 23 18:53:01.431902 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:01.431863 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"]
Apr 23 18:53:01.436185 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:53:01.436152 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b94ea5b_5812_442e_b351_02b01e3347b3.slice/crio-29b01ea1760e05bd28b6f88b995f2b067dfb147e13d50dd4aacc17f8b70a6e05 WatchSource:0}: Error finding container 29b01ea1760e05bd28b6f88b995f2b067dfb147e13d50dd4aacc17f8b70a6e05: Status 404 returned error can't find the container with id 29b01ea1760e05bd28b6f88b995f2b067dfb147e13d50dd4aacc17f8b70a6e05
Apr 23 18:53:02.140147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:02.140108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerStarted","Data":"7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df"}
Apr 23 18:53:02.140147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:02.140149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerStarted","Data":"29b01ea1760e05bd28b6f88b995f2b067dfb147e13d50dd4aacc17f8b70a6e05"}
Apr 23 18:53:03.145102 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:03.145070 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerID="7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df" exitCode=0
Apr 23 18:53:03.145494 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:03.145111 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerDied","Data":"7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df"}
Apr 23 18:53:04.150867 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:04.150826 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerStarted","Data":"33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a"}
Apr 23 18:53:04.151282 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:04.150873 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerStarted","Data":"7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d"}
Apr 23 18:53:04.151282 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:04.151072 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"
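Note: the ordering above (7f87ca... starts, runs, and dies with exitCode=0 before 33dcd7b... and 7288595... start) is consistent with the storage-initializer running as an init container, which is how KServe provisions the model into the kserve-provision-location emptyDir before the server starts; "storage-initializer" does appear by name in the RemoveStaleState entries elsewhere in this log. A compressed Go sketch of that gating logic; the function and container names are stand-ins for illustration, not the real images:

package main

import "fmt"

// runToCompletion stands in for running one init container and waiting for it to exit.
func runToCompletion(name string) int {
	fmt.Printf("ContainerStarted %s\n", name)
	// ... e.g. download the model from S3 into the shared emptyDir ...
	fmt.Printf("ContainerDied %s exitCode=0\n", name)
	return 0
}

func main() {
	// Init containers run one at a time; a non-zero exit keeps the pod in Init.
	if code := runToCompletion("storage-initializer"); code != 0 {
		fmt.Printf("init failed with exitCode=%d; main containers never start\n", code)
		return
	}
	// Only after every init container succeeds are the main containers started.
	for _, c := range []string{"kserve-container", "kube-rbac-proxy"} {
		fmt.Printf("ContainerStarted %s\n", c)
	}
}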
Apr 23 18:53:04.172526 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:04.172417 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podStartSLOduration=4.172392639 podStartE2EDuration="4.172392639s" podCreationTimestamp="2026-04-23 18:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:53:04.171299175 +0000 UTC m=+3642.091391649" watchObservedRunningTime="2026-04-23 18:53:04.172392639 +0000 UTC m=+3642.092485115"
Apr 23 18:53:05.132943 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.132902 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"
Apr 23 18:53:05.155945 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.155896 2568 generic.go:358] "Generic (PLEG): container finished" podID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerID="dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf" exitCode=0
Apr 23 18:53:05.156408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.155971 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerDied","Data":"dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf"}
Apr 23 18:53:05.156408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.156001 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"
Apr 23 18:53:05.156408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.156030 2568 scope.go:117] "RemoveContainer" containerID="297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2"
Apr 23 18:53:05.156408 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.156015 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm" event={"ID":"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd","Type":"ContainerDied","Data":"8f83592f9367c1b6674f2292f17059f500845da9c382957a07d2a44ed4c1905e"}
Apr 23 18:53:05.157078 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.157041 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"
Apr 23 18:53:05.161954 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.157926 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 23 18:53:05.171203 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.171172 2568 scope.go:117] "RemoveContainer" containerID="dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf"
Apr 23 18:53:05.178831 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.178809 2568 scope.go:117] "RemoveContainer" containerID="3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d"
Apr 23 18:53:05.185680 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.185656 2568 scope.go:117] "RemoveContainer" containerID="297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2"
Apr 23 18:53:05.185995 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:53:05.185974 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2\": container with ID starting with
297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2 not found: ID does not exist" containerID="297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2" Apr 23 18:53:05.186060 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.186007 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2"} err="failed to get container status \"297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2\": rpc error: code = NotFound desc = could not find container \"297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2\": container with ID starting with 297eff1a5a4d937740fa4174b640b876daf3c966ff189946b5c068bc91ec04c2 not found: ID does not exist" Apr 23 18:53:05.186060 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.186032 2568 scope.go:117] "RemoveContainer" containerID="dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf" Apr 23 18:53:05.186282 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:53:05.186262 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf\": container with ID starting with dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf not found: ID does not exist" containerID="dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf" Apr 23 18:53:05.186336 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.186286 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf"} err="failed to get container status \"dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf\": rpc error: code = NotFound desc = could not find container \"dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf\": container with ID starting with dd6e58066cd85a059af37b569267ac3c3154ba3bdd3b52b8232b348917088baf not found: ID does not exist" Apr 23 18:53:05.186336 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.186299 2568 scope.go:117] "RemoveContainer" containerID="3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d" Apr 23 18:53:05.186522 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:53:05.186506 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d\": container with ID starting with 3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d not found: ID does not exist" containerID="3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d" Apr 23 18:53:05.186567 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.186525 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d"} err="failed to get container status \"3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d\": rpc error: code = NotFound desc = could not find container \"3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d\": container with ID starting with 3dc974f6348b1b96727dbbda9bf53028ab5077313fc103de1e3a5a62334a1f0d not found: ID does not exist" Apr 23 18:53:05.211137 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.211103 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kserve-provision-location\") pod \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " Apr 23 18:53:05.211475 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.211446 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" (UID: "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:53:05.211622 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.211602 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:53:05.311912 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.311822 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mth27\" (UniqueName: \"kubernetes.io/projected/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kube-api-access-mth27\") pod \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " Apr 23 18:53:05.311912 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.311870 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls\") pod \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " Apr 23 18:53:05.311912 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.311895 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\" (UID: \"f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd\") " Apr 23 18:53:05.312317 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.312282 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" (UID: "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:53:05.314002 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.313979 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kube-api-access-mth27" (OuterVolumeSpecName: "kube-api-access-mth27") pod "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" (UID: "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd"). InnerVolumeSpecName "kube-api-access-mth27". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:53:05.314093 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.314057 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" (UID: "f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:53:05.412506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.412465 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mth27\" (UniqueName: \"kubernetes.io/projected/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-kube-api-access-mth27\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:53:05.412506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.412500 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:53:05.412506 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.412511 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:53:05.477325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.477283 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"] Apr 23 18:53:05.482782 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:05.482755 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-569fbb67c4-rvlgm"] Apr 23 18:53:06.159536 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:06.159498 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 18:53:06.598305 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:06.598270 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" path="/var/lib/kubelet/pods/f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd/volumes" Apr 23 18:53:11.163777 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:11.163747 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:53:11.164283 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:11.164252 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 18:53:21.164839 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:21.164798 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: 
connection refused" Apr 23 18:53:31.164896 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:31.164855 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 18:53:41.165073 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:41.164978 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 18:53:51.164476 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:53:51.164437 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 18:54:01.164634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:01.164600 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 18:54:11.165572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:11.165542 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:54:21.004204 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.004167 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"] Apr 23 18:54:21.004663 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.004604 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" containerID="cri-o://7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d" gracePeriod=30 Apr 23 18:54:21.004815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.004666 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kube-rbac-proxy" containerID="cri-o://33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a" gracePeriod=30 Apr 23 18:54:21.160002 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.159967 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.57:8643/healthz\": dial tcp 10.133.0.57:8643: connect: connection refused" Apr 23 18:54:21.164652 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.164628 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" 
podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 23 18:54:21.368313 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.368281 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerID="33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a" exitCode=2
Apr 23 18:54:21.368477 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:21.368355 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerDied","Data":"33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a"}
Apr 23 18:54:22.118483 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118449 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk"]
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118714 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118726 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118740 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="storage-initializer"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118746 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="storage-initializer"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118759 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kube-rbac-proxy"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118764 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kube-rbac-proxy"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118804 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kserve-container"
Apr 23 18:54:22.118860 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.118814 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7a54c88-4e7b-4dc0-b90e-b67c46e25fbd" containerName="kube-rbac-proxy"
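Note: the gracePeriod=30 kills at 18:54:21 follow the standard termination contract: the runtime signals the container with SIGTERM, waits up to the grace period for it to exit on its own, and only then sends SIGKILL. The exitCode=2 from kube-rbac-proxy above is consistent with a process that exits non-zero on SIGTERM rather than shutting down cleanly. A minimal Unix-only Go sketch of the SIGTERM-then-SIGKILL pattern, using a hypothetical child process as the workload:

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// Stand-in workload; the real kubelet signals the container's init process via the runtime.
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		fmt.Println("start failed:", err)
		return
	}

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	gracePeriod := 30 * time.Second        // the gracePeriod=30 from the log
	cmd.Process.Signal(syscall.SIGTERM)    // polite shutdown request
	select {
	case err := <-done:
		fmt.Println("exited within grace period:", err)
	case <-time.After(gracePeriod):
		cmd.Process.Kill() // grace period elapsed: SIGKILL
		fmt.Println("killed after grace period:", <-done)
	}
}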
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.123799 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.123780 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 23 18:54:22.123913 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.123784 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 23 18:54:22.133546 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.133522 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk"] Apr 23 18:54:22.241392 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.241353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khf58\" (UniqueName: \"kubernetes.io/projected/84a76fef-9488-4ca9-9589-0ad4873c30d4-kube-api-access-khf58\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.241392 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.241389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a76fef-9488-4ca9-9589-0ad4873c30d4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.241637 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.241471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84a76fef-9488-4ca9-9589-0ad4873c30d4-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.241637 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.241556 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84a76fef-9488-4ca9-9589-0ad4873c30d4-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.342369 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.342338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khf58\" (UniqueName: \"kubernetes.io/projected/84a76fef-9488-4ca9-9589-0ad4873c30d4-kube-api-access-khf58\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.342476 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.342375 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a76fef-9488-4ca9-9589-0ad4873c30d4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.342476 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.342410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84a76fef-9488-4ca9-9589-0ad4873c30d4-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.342476 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.342459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84a76fef-9488-4ca9-9589-0ad4873c30d4-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.342833 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.342805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a76fef-9488-4ca9-9589-0ad4873c30d4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.343133 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.343112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84a76fef-9488-4ca9-9589-0ad4873c30d4-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.344914 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.344894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84a76fef-9488-4ca9-9589-0ad4873c30d4-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.351431 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.351407 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khf58\" (UniqueName: \"kubernetes.io/projected/84a76fef-9488-4ca9-9589-0ad4873c30d4-kube-api-access-khf58\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.432227 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.432144 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:22.552669 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:22.552637 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk"] Apr 23 18:54:22.556785 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:54:22.556759 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a76fef_9488_4ca9_9589_0ad4873c30d4.slice/crio-5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776 WatchSource:0}: Error finding container 5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776: Status 404 returned error can't find the container with id 5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776 Apr 23 18:54:23.375245 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:23.375204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" event={"ID":"84a76fef-9488-4ca9-9589-0ad4873c30d4","Type":"ContainerStarted","Data":"de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a"} Apr 23 18:54:23.375245 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:23.375247 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" event={"ID":"84a76fef-9488-4ca9-9589-0ad4873c30d4","Type":"ContainerStarted","Data":"5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776"} Apr 23 18:54:25.342130 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.342105 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:54:25.382646 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.382615 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerID="7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d" exitCode=0 Apr 23 18:54:25.382817 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.382691 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" Apr 23 18:54:25.382817 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.382690 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerDied","Data":"7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d"} Apr 23 18:54:25.382817 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.382806 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j" event={"ID":"1b94ea5b-5812-442e-b351-02b01e3347b3","Type":"ContainerDied","Data":"29b01ea1760e05bd28b6f88b995f2b067dfb147e13d50dd4aacc17f8b70a6e05"} Apr 23 18:54:25.382995 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.382829 2568 scope.go:117] "RemoveContainer" containerID="33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a" Apr 23 18:54:25.390140 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.390091 2568 scope.go:117] "RemoveContainer" containerID="7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d" Apr 23 18:54:25.396962 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.396940 2568 scope.go:117] "RemoveContainer" containerID="7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df" Apr 23 18:54:25.403291 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.403276 2568 scope.go:117] "RemoveContainer" containerID="33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a" Apr 23 18:54:25.403542 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:54:25.403524 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a\": container with ID starting with 33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a not found: ID does not exist" containerID="33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a" Apr 23 18:54:25.403590 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.403557 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a"} err="failed to get container status \"33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a\": rpc error: code = NotFound desc = could not find container \"33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a\": container with ID starting with 33dcd7b67e323535fa0b6ec185a622d268f70642e961713a1b0ab019a760ac6a not found: ID does not exist" Apr 23 18:54:25.403590 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.403576 2568 scope.go:117] "RemoveContainer" containerID="7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d" Apr 23 18:54:25.403794 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:54:25.403777 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d\": container with ID starting with 7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d not found: ID does not exist" containerID="7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d" Apr 23 18:54:25.403841 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.403802 2568 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d"} err="failed to get container status \"7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d\": rpc error: code = NotFound desc = could not find container \"7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d\": container with ID starting with 7288595377abf28c323efdfeafde5d13e2588f530accd33603fd9247888d4a1d not found: ID does not exist" Apr 23 18:54:25.403841 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.403818 2568 scope.go:117] "RemoveContainer" containerID="7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df" Apr 23 18:54:25.404041 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:54:25.404021 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df\": container with ID starting with 7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df not found: ID does not exist" containerID="7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df" Apr 23 18:54:25.404102 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.404048 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df"} err="failed to get container status \"7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df\": rpc error: code = NotFound desc = could not find container \"7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df\": container with ID starting with 7f87ca706da39ed823b069559d07a45591a3c2a4a659a30900452b931a3df2df not found: ID does not exist" Apr 23 18:54:25.468288 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468205 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktkh2\" (UniqueName: \"kubernetes.io/projected/1b94ea5b-5812-442e-b351-02b01e3347b3-kube-api-access-ktkh2\") pod \"1b94ea5b-5812-442e-b351-02b01e3347b3\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " Apr 23 18:54:25.468288 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468240 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b94ea5b-5812-442e-b351-02b01e3347b3-kserve-provision-location\") pod \"1b94ea5b-5812-442e-b351-02b01e3347b3\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " Apr 23 18:54:25.468288 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468279 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-cabundle-cert\") pod \"1b94ea5b-5812-442e-b351-02b01e3347b3\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " Apr 23 18:54:25.468546 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468393 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b94ea5b-5812-442e-b351-02b01e3347b3-proxy-tls\") pod \"1b94ea5b-5812-442e-b351-02b01e3347b3\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " Apr 23 18:54:25.468546 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468457 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"1b94ea5b-5812-442e-b351-02b01e3347b3\" (UID: \"1b94ea5b-5812-442e-b351-02b01e3347b3\") " Apr 23 18:54:25.468663 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468639 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b94ea5b-5812-442e-b351-02b01e3347b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b94ea5b-5812-442e-b351-02b01e3347b3" (UID: "1b94ea5b-5812-442e-b351-02b01e3347b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:54:25.468732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468659 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "1b94ea5b-5812-442e-b351-02b01e3347b3" (UID: "1b94ea5b-5812-442e-b351-02b01e3347b3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:54:25.468864 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.468837 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "1b94ea5b-5812-442e-b351-02b01e3347b3" (UID: "1b94ea5b-5812-442e-b351-02b01e3347b3"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:54:25.470513 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.470486 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b94ea5b-5812-442e-b351-02b01e3347b3-kube-api-access-ktkh2" (OuterVolumeSpecName: "kube-api-access-ktkh2") pod "1b94ea5b-5812-442e-b351-02b01e3347b3" (UID: "1b94ea5b-5812-442e-b351-02b01e3347b3"). InnerVolumeSpecName "kube-api-access-ktkh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:54:25.470513 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.470498 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b94ea5b-5812-442e-b351-02b01e3347b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b94ea5b-5812-442e-b351-02b01e3347b3" (UID: "1b94ea5b-5812-442e-b351-02b01e3347b3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:54:25.569158 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.569118 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ktkh2\" (UniqueName: \"kubernetes.io/projected/1b94ea5b-5812-442e-b351-02b01e3347b3-kube-api-access-ktkh2\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:25.569158 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.569150 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b94ea5b-5812-442e-b351-02b01e3347b3-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:25.569158 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.569165 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-cabundle-cert\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:25.569393 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.569179 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b94ea5b-5812-442e-b351-02b01e3347b3-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:25.569393 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.569191 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b94ea5b-5812-442e-b351-02b01e3347b3-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:25.705908 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.705875 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"] Apr 23 18:54:25.713428 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:25.713403 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-6d77c98f46-g2r5j"] Apr 23 18:54:26.599725 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:26.599692 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" path="/var/lib/kubelet/pods/1b94ea5b-5812-442e-b351-02b01e3347b3/volumes" Apr 23 18:54:27.395185 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:27.395158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/0.log" Apr 23 18:54:27.395355 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:27.395193 2568 generic.go:358] "Generic (PLEG): container finished" podID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerID="de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a" exitCode=1 Apr 23 18:54:27.395355 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:27.395277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" event={"ID":"84a76fef-9488-4ca9-9589-0ad4873c30d4","Type":"ContainerDied","Data":"de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a"} Apr 23 18:54:28.399715 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:28.399685 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/0.log" Apr 23 18:54:28.400197 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:28.399785 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" event={"ID":"84a76fef-9488-4ca9-9589-0ad4873c30d4","Type":"ContainerStarted","Data":"17aec3a8e0148323d13c1eb29f8089cbf7a47c1cdc47277a8380ae11ddfcef27"} Apr 23 18:54:31.410703 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.410625 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/1.log" Apr 23 18:54:31.411120 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.410965 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/0.log" Apr 23 18:54:31.411120 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.410993 2568 generic.go:358] "Generic (PLEG): container finished" podID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerID="17aec3a8e0148323d13c1eb29f8089cbf7a47c1cdc47277a8380ae11ddfcef27" exitCode=1 Apr 23 18:54:31.411120 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.411072 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" event={"ID":"84a76fef-9488-4ca9-9589-0ad4873c30d4","Type":"ContainerDied","Data":"17aec3a8e0148323d13c1eb29f8089cbf7a47c1cdc47277a8380ae11ddfcef27"} Apr 23 18:54:31.411226 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.411123 2568 scope.go:117] "RemoveContainer" containerID="de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a" Apr 23 18:54:31.411496 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.411481 2568 scope.go:117] "RemoveContainer" containerID="de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a" Apr 23 18:54:31.420979 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:54:31.420942 2568 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_kserve-ci-e2e-test_84a76fef-9488-4ca9-9589-0ad4873c30d4_0 in pod sandbox 5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776 from index: no such id: 'de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a'" containerID="de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a" Apr 23 18:54:31.421106 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:31.420985 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_kserve-ci-e2e-test_84a76fef-9488-4ca9-9589-0ad4873c30d4_0 in pod sandbox 5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776 from index: no such id: 'de236da68eedbe11d28f1bbe3dd3e88d963d5510203f94cb24d0ac366800883a'" Apr 23 18:54:31.421189 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:54:31.421173 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_kserve-ci-e2e-test(84a76fef-9488-4ca9-9589-0ad4873c30d4)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" Apr 23 18:54:32.086604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.086563 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk"] Apr 23 18:54:32.415075 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.414993 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/1.log" Apr 23 18:54:32.541880 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.541857 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/1.log" Apr 23 18:54:32.542042 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.541922 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:32.625414 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.625370 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khf58\" (UniqueName: \"kubernetes.io/projected/84a76fef-9488-4ca9-9589-0ad4873c30d4-kube-api-access-khf58\") pod \"84a76fef-9488-4ca9-9589-0ad4873c30d4\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " Apr 23 18:54:32.625414 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.625417 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84a76fef-9488-4ca9-9589-0ad4873c30d4-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"84a76fef-9488-4ca9-9589-0ad4873c30d4\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " Apr 23 18:54:32.625633 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.625452 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84a76fef-9488-4ca9-9589-0ad4873c30d4-proxy-tls\") pod \"84a76fef-9488-4ca9-9589-0ad4873c30d4\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " Apr 23 18:54:32.625633 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.625487 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a76fef-9488-4ca9-9589-0ad4873c30d4-kserve-provision-location\") pod \"84a76fef-9488-4ca9-9589-0ad4873c30d4\" (UID: \"84a76fef-9488-4ca9-9589-0ad4873c30d4\") " Apr 23 18:54:32.625828 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.625803 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a76fef-9488-4ca9-9589-0ad4873c30d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84a76fef-9488-4ca9-9589-0ad4873c30d4" (UID: "84a76fef-9488-4ca9-9589-0ad4873c30d4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:54:32.625873 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.625808 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a76fef-9488-4ca9-9589-0ad4873c30d4-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "84a76fef-9488-4ca9-9589-0ad4873c30d4" (UID: "84a76fef-9488-4ca9-9589-0ad4873c30d4"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:54:32.627632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.627595 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a76fef-9488-4ca9-9589-0ad4873c30d4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "84a76fef-9488-4ca9-9589-0ad4873c30d4" (UID: "84a76fef-9488-4ca9-9589-0ad4873c30d4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:54:32.627632 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.627604 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a76fef-9488-4ca9-9589-0ad4873c30d4-kube-api-access-khf58" (OuterVolumeSpecName: "kube-api-access-khf58") pod "84a76fef-9488-4ca9-9589-0ad4873c30d4" (UID: "84a76fef-9488-4ca9-9589-0ad4873c30d4"). InnerVolumeSpecName "kube-api-access-khf58". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:54:32.726396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.726291 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khf58\" (UniqueName: \"kubernetes.io/projected/84a76fef-9488-4ca9-9589-0ad4873c30d4-kube-api-access-khf58\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:32.726396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.726337 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84a76fef-9488-4ca9-9589-0ad4873c30d4-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:32.726396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.726354 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84a76fef-9488-4ca9-9589-0ad4873c30d4-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:32.726396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:32.726363 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a76fef-9488-4ca9-9589-0ad4873c30d4-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:54:33.209471 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209434 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv"] Apr 23 18:54:33.209803 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209784 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209805 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209821 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="storage-initializer" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209830 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="storage-initializer" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209842 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerName="storage-initializer" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209851 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerName="storage-initializer" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209863 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kube-rbac-proxy" Apr 23 18:54:33.209895 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209872 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kube-rbac-proxy" Apr 23 18:54:33.210302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209959 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kube-rbac-proxy" Apr 23 18:54:33.210302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209973 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerName="storage-initializer" Apr 23 18:54:33.210302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.209988 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerName="storage-initializer" Apr 23 18:54:33.210302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.210000 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b94ea5b-5812-442e-b351-02b01e3347b3" containerName="kserve-container" Apr 23 18:54:33.210302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.210071 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerName="storage-initializer" Apr 23 18:54:33.210302 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.210081 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" containerName="storage-initializer" Apr 23 18:54:33.214566 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.214544 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.216545 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.216521 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 23 18:54:33.216545 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.216537 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 23 18:54:33.216704 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.216549 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:54:33.226386 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.226360 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv"] Apr 23 18:54:33.331217 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.331183 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0bff882-7563-41a4-a2cd-77259d4ba28f-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.331409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.331227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.331409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.331247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.331409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.331265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0bff882-7563-41a4-a2cd-77259d4ba28f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.331409 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.331294 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c2g7\" (UniqueName: \"kubernetes.io/projected/f0bff882-7563-41a4-a2cd-77259d4ba28f-kube-api-access-4c2g7\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.418573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.418545 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk_84a76fef-9488-4ca9-9589-0ad4873c30d4/storage-initializer/1.log" Apr 23 18:54:33.419021 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.418606 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" event={"ID":"84a76fef-9488-4ca9-9589-0ad4873c30d4","Type":"ContainerDied","Data":"5c1f961f7b23cc7280c1cf3e1178be43b32659fb3b70e135501f144c9aabb776"} Apr 23 18:54:33.419021 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.418648 2568 scope.go:117] "RemoveContainer" containerID="17aec3a8e0148323d13c1eb29f8089cbf7a47c1cdc47277a8380ae11ddfcef27" Apr 23 18:54:33.419021 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.418662 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk" Apr 23 18:54:33.432562 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.432540 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0bff882-7563-41a4-a2cd-77259d4ba28f-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.432649 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.432579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.432649 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.432601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.432796 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.432765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0bff882-7563-41a4-a2cd-77259d4ba28f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.432906 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.432821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c2g7\" (UniqueName: \"kubernetes.io/projected/f0bff882-7563-41a4-a2cd-77259d4ba28f-kube-api-access-4c2g7\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: 
\"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.433141 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.433120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0bff882-7563-41a4-a2cd-77259d4ba28f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.433321 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.433299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.433386 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.433324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.435035 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.435015 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0bff882-7563-41a4-a2cd-77259d4ba28f-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.440624 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.440600 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2g7\" (UniqueName: \"kubernetes.io/projected/f0bff882-7563-41a4-a2cd-77259d4ba28f-kube-api-access-4c2g7\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.455350 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.455321 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk"] Apr 23 18:54:33.459204 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.459177 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-7fffb8d84d-dbrpk"] Apr 23 18:54:33.525189 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.525094 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:33.647830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:33.647788 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv"] Apr 23 18:54:33.650843 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:54:33.650807 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0bff882_7563_41a4_a2cd_77259d4ba28f.slice/crio-90103736f2d7d679aadad56970d906073fd2c2f89026e8a91cd5afdf99b8b9ae WatchSource:0}: Error finding container 90103736f2d7d679aadad56970d906073fd2c2f89026e8a91cd5afdf99b8b9ae: Status 404 returned error can't find the container with id 90103736f2d7d679aadad56970d906073fd2c2f89026e8a91cd5afdf99b8b9ae Apr 23 18:54:34.424527 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:34.424489 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerStarted","Data":"454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc"} Apr 23 18:54:34.424945 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:34.424532 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerStarted","Data":"90103736f2d7d679aadad56970d906073fd2c2f89026e8a91cd5afdf99b8b9ae"} Apr 23 18:54:34.598852 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:34.598816 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a76fef-9488-4ca9-9589-0ad4873c30d4" path="/var/lib/kubelet/pods/84a76fef-9488-4ca9-9589-0ad4873c30d4/volumes" Apr 23 18:54:35.429607 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:35.429572 2568 generic.go:358] "Generic (PLEG): container finished" podID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerID="454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc" exitCode=0 Apr 23 18:54:35.430161 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:35.429634 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerDied","Data":"454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc"} Apr 23 18:54:36.434918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:36.434880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerStarted","Data":"fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc"} Apr 23 18:54:36.434918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:36.434922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerStarted","Data":"2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c"} Apr 23 18:54:36.435433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:36.435121 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 
18:54:36.435433 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:36.435256 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:36.436467 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:36.436439 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:54:36.454833 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:36.454779 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podStartSLOduration=3.454763569 podStartE2EDuration="3.454763569s" podCreationTimestamp="2026-04-23 18:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:54:36.453577591 +0000 UTC m=+3734.373670086" watchObservedRunningTime="2026-04-23 18:54:36.454763569 +0000 UTC m=+3734.374856043" Apr 23 18:54:37.438013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:37.437971 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:54:42.442236 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:42.442205 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:54:42.442732 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:42.442706 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:54:52.442749 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:54:52.442706 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:55:02.443682 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:02.443597 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:55:12.443147 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:12.443105 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:55:22.443002 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:22.442961 2568 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:55:32.443133 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:32.443097 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 18:55:42.444098 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:42.444065 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:55:43.223697 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:43.223662 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv"] Apr 23 18:55:43.223988 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:43.223962 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" containerID="cri-o://2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c" gracePeriod=30 Apr 23 18:55:43.224096 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:43.224004 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kube-rbac-proxy" containerID="cri-o://fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc" gracePeriod=30 Apr 23 18:55:43.619201 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:43.619165 2568 generic.go:358] "Generic (PLEG): container finished" podID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerID="fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc" exitCode=2 Apr 23 18:55:43.619608 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:43.619236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerDied","Data":"fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc"} Apr 23 18:55:44.367462 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.367421 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n"] Apr 23 18:55:44.371331 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.371309 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.373376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.373353 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 23 18:55:44.373631 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.373614 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 23 18:55:44.379999 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.379973 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n"] Apr 23 18:55:44.471780 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.471738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c56n\" (UniqueName: \"kubernetes.io/projected/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kube-api-access-6c56n\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.472006 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.471808 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.472006 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.471846 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.472006 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.471892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.573275 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.573243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c56n\" (UniqueName: \"kubernetes.io/projected/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kube-api-access-6c56n\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.573485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.573307 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.573485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.573345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.573485 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.573391 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.573910 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.573875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.574146 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.574124 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.575812 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.575790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.582674 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.582652 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c56n\" (UniqueName: \"kubernetes.io/projected/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kube-api-access-6c56n\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.683370 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.683277 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:44.802371 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.802340 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n"] Apr 23 18:55:44.805594 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:55:44.805562 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d553f61_65fc_42d0_9be8_8a8f7c0d622a.slice/crio-99a702b7d938f7a22bcec020d41504a5e7b1be2a742e05aa23dadcce073ffe24 WatchSource:0}: Error finding container 99a702b7d938f7a22bcec020d41504a5e7b1be2a742e05aa23dadcce073ffe24: Status 404 returned error can't find the container with id 99a702b7d938f7a22bcec020d41504a5e7b1be2a742e05aa23dadcce073ffe24 Apr 23 18:55:44.807341 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:44.807316 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:55:45.626401 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:45.626367 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" event={"ID":"7d553f61-65fc-42d0-9be8-8a8f7c0d622a","Type":"ContainerStarted","Data":"dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd"} Apr 23 18:55:45.626401 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:45.626403 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" event={"ID":"7d553f61-65fc-42d0-9be8-8a8f7c0d622a","Type":"ContainerStarted","Data":"99a702b7d938f7a22bcec020d41504a5e7b1be2a742e05aa23dadcce073ffe24"} Apr 23 18:55:47.439134 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.439093 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 23 18:55:47.761466 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.761441 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:55:47.799655 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.799628 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0bff882-7563-41a4-a2cd-77259d4ba28f-proxy-tls\") pod \"f0bff882-7563-41a4-a2cd-77259d4ba28f\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " Apr 23 18:55:47.799819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.799688 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c2g7\" (UniqueName: \"kubernetes.io/projected/f0bff882-7563-41a4-a2cd-77259d4ba28f-kube-api-access-4c2g7\") pod \"f0bff882-7563-41a4-a2cd-77259d4ba28f\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " Apr 23 18:55:47.799819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.799717 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"f0bff882-7563-41a4-a2cd-77259d4ba28f\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " Apr 23 18:55:47.799819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.799750 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-cabundle-cert\") pod \"f0bff882-7563-41a4-a2cd-77259d4ba28f\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " Apr 23 18:55:47.799819 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.799766 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0bff882-7563-41a4-a2cd-77259d4ba28f-kserve-provision-location\") pod \"f0bff882-7563-41a4-a2cd-77259d4ba28f\" (UID: \"f0bff882-7563-41a4-a2cd-77259d4ba28f\") " Apr 23 18:55:47.800179 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.800145 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0bff882-7563-41a4-a2cd-77259d4ba28f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f0bff882-7563-41a4-a2cd-77259d4ba28f" (UID: "f0bff882-7563-41a4-a2cd-77259d4ba28f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:55:47.800267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.800188 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "f0bff882-7563-41a4-a2cd-77259d4ba28f" (UID: "f0bff882-7563-41a4-a2cd-77259d4ba28f"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:55:47.800267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.800190 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f0bff882-7563-41a4-a2cd-77259d4ba28f" (UID: "f0bff882-7563-41a4-a2cd-77259d4ba28f"). 
InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:55:47.801837 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.801816 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0bff882-7563-41a4-a2cd-77259d4ba28f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f0bff882-7563-41a4-a2cd-77259d4ba28f" (UID: "f0bff882-7563-41a4-a2cd-77259d4ba28f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:55:47.801918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.801853 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0bff882-7563-41a4-a2cd-77259d4ba28f-kube-api-access-4c2g7" (OuterVolumeSpecName: "kube-api-access-4c2g7") pod "f0bff882-7563-41a4-a2cd-77259d4ba28f" (UID: "f0bff882-7563-41a4-a2cd-77259d4ba28f"). InnerVolumeSpecName "kube-api-access-4c2g7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:55:47.901198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.901160 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4c2g7\" (UniqueName: \"kubernetes.io/projected/f0bff882-7563-41a4-a2cd-77259d4ba28f-kube-api-access-4c2g7\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:47.901198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.901190 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:47.901198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.901203 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0bff882-7563-41a4-a2cd-77259d4ba28f-cabundle-cert\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:47.901430 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.901212 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0bff882-7563-41a4-a2cd-77259d4ba28f-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:47.901430 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:47.901221 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0bff882-7563-41a4-a2cd-77259d4ba28f-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:48.636306 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.636278 2568 generic.go:358] "Generic (PLEG): container finished" podID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerID="2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c" exitCode=0 Apr 23 18:55:48.636653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.636365 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" Apr 23 18:55:48.636653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.636364 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerDied","Data":"2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c"} Apr 23 18:55:48.636653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.636474 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv" event={"ID":"f0bff882-7563-41a4-a2cd-77259d4ba28f","Type":"ContainerDied","Data":"90103736f2d7d679aadad56970d906073fd2c2f89026e8a91cd5afdf99b8b9ae"} Apr 23 18:55:48.636653 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.636497 2568 scope.go:117] "RemoveContainer" containerID="fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc" Apr 23 18:55:48.644108 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.644090 2568 scope.go:117] "RemoveContainer" containerID="2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c" Apr 23 18:55:48.650939 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.650910 2568 scope.go:117] "RemoveContainer" containerID="454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc" Apr 23 18:55:48.656004 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.655982 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv"] Apr 23 18:55:48.657774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.657753 2568 scope.go:117] "RemoveContainer" containerID="fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc" Apr 23 18:55:48.658066 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:55:48.658047 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc\": container with ID starting with fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc not found: ID does not exist" containerID="fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc" Apr 23 18:55:48.658141 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.658074 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc"} err="failed to get container status \"fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc\": rpc error: code = NotFound desc = could not find container \"fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc\": container with ID starting with fe8994803d2c4d57b054a8a98f60cc951a0d91133e57f7b1677238e0ca7fedcc not found: ID does not exist" Apr 23 18:55:48.658141 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.658096 2568 scope.go:117] "RemoveContainer" containerID="2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c" Apr 23 18:55:48.658318 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:55:48.658303 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c\": container with ID starting with 2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c not found: ID does not 
exist" containerID="2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c" Apr 23 18:55:48.658361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.658320 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c"} err="failed to get container status \"2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c\": rpc error: code = NotFound desc = could not find container \"2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c\": container with ID starting with 2dd5baff5eb03a864841b9038a21bcddd26275c3d270006703d9dda0c9306e6c not found: ID does not exist" Apr 23 18:55:48.658361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.658341 2568 scope.go:117] "RemoveContainer" containerID="454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc" Apr 23 18:55:48.658561 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:55:48.658545 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc\": container with ID starting with 454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc not found: ID does not exist" containerID="454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc" Apr 23 18:55:48.658612 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.658565 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc"} err="failed to get container status \"454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc\": rpc error: code = NotFound desc = could not find container \"454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc\": container with ID starting with 454eb0d44c99d5eca725605b890c7f099337217df2fc4114d82735fc427c57fc not found: ID does not exist" Apr 23 18:55:48.660685 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:48.660664 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-6c6bdfcdc4-kg5fv"] Apr 23 18:55:50.598830 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:50.598798 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" path="/var/lib/kubelet/pods/f0bff882-7563-41a4-a2cd-77259d4ba28f/volumes" Apr 23 18:55:50.644831 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:50.644808 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n_7d553f61-65fc-42d0-9be8-8a8f7c0d622a/storage-initializer/0.log" Apr 23 18:55:50.645001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:50.644843 2568 generic.go:358] "Generic (PLEG): container finished" podID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerID="dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd" exitCode=1 Apr 23 18:55:50.645001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:50.644910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" event={"ID":"7d553f61-65fc-42d0-9be8-8a8f7c0d622a","Type":"ContainerDied","Data":"dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd"} Apr 23 18:55:51.650335 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:51.650307 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n_7d553f61-65fc-42d0-9be8-8a8f7c0d622a/storage-initializer/0.log" Apr 23 18:55:51.650770 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:51.650415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" event={"ID":"7d553f61-65fc-42d0-9be8-8a8f7c0d622a","Type":"ContainerStarted","Data":"adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac"} Apr 23 18:55:54.325507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:54.325472 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n"] Apr 23 18:55:54.325992 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:54.325780 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerName="storage-initializer" containerID="cri-o://adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac" gracePeriod=30 Apr 23 18:55:55.393960 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.393911 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"] Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394207 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kube-rbac-proxy" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394218 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kube-rbac-proxy" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394227 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="storage-initializer" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394233 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="storage-initializer" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394244 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394249 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394291 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kserve-container" Apr 23 18:55:55.394325 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.394299 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0bff882-7563-41a4-a2cd-77259d4ba28f" containerName="kube-rbac-proxy" Apr 23 18:55:55.397013 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.396984 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.398990 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.398967 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 23 18:55:55.399066 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.398972 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 23 18:55:55.399237 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.399224 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:55:55.411545 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.411486 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"] Apr 23 18:55:55.462273 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.462236 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c455d1bd-98b4-472b-b639-999401359fd4-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.462429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.462281 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.462429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.462308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c455d1bd-98b4-472b-b639-999401359fd4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.462429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.462334 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.462429 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.462361 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wknnz\" (UniqueName: \"kubernetes.io/projected/c455d1bd-98b4-472b-b639-999401359fd4-kube-api-access-wknnz\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563061 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c455d1bd-98b4-472b-b639-999401359fd4-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563061 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563094 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c455d1bd-98b4-472b-b639-999401359fd4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563323 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wknnz\" (UniqueName: \"kubernetes.io/projected/c455d1bd-98b4-472b-b639-999401359fd4-kube-api-access-wknnz\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563498 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c455d1bd-98b4-472b-b639-999401359fd4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.563864 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.563813 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.565577 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.565555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c455d1bd-98b4-472b-b639-999401359fd4-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.572070 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.572049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wknnz\" (UniqueName: \"kubernetes.io/projected/c455d1bd-98b4-472b-b639-999401359fd4-kube-api-access-wknnz\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.707222 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.707139 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:55.829518 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:55.829486 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"] Apr 23 18:55:55.832506 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:55:55.832475 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc455d1bd_98b4_472b_b639_999401359fd4.slice/crio-5154a8edf44022487c530f660b1e7b19842834312d8e5099699999371915fd32 WatchSource:0}: Error finding container 5154a8edf44022487c530f660b1e7b19842834312d8e5099699999371915fd32: Status 404 returned error can't find the container with id 5154a8edf44022487c530f660b1e7b19842834312d8e5099699999371915fd32 Apr 23 18:55:56.666413 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:56.666375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerStarted","Data":"3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174"} Apr 23 18:55:56.666858 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:56.666430 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerStarted","Data":"5154a8edf44022487c530f660b1e7b19842834312d8e5099699999371915fd32"} Apr 23 18:55:57.273837 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.273808 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n_7d553f61-65fc-42d0-9be8-8a8f7c0d622a/storage-initializer/1.log" Apr 23 
18:55:57.274198 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.274181 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n_7d553f61-65fc-42d0-9be8-8a8f7c0d622a/storage-initializer/0.log" Apr 23 18:55:57.274261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.274251 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:57.378162 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.378069 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kserve-provision-location\") pod \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " Apr 23 18:55:57.378162 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.378122 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c56n\" (UniqueName: \"kubernetes.io/projected/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kube-api-access-6c56n\") pod \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " Apr 23 18:55:57.378162 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.378151 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " Apr 23 18:55:57.378411 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.378177 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-proxy-tls\") pod \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\" (UID: \"7d553f61-65fc-42d0-9be8-8a8f7c0d622a\") " Apr 23 18:55:57.378461 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.378393 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d553f61-65fc-42d0-9be8-8a8f7c0d622a" (UID: "7d553f61-65fc-42d0-9be8-8a8f7c0d622a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:55:57.378532 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.378511 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "7d553f61-65fc-42d0-9be8-8a8f7c0d622a" (UID: "7d553f61-65fc-42d0-9be8-8a8f7c0d622a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:55:57.380277 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.380256 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7d553f61-65fc-42d0-9be8-8a8f7c0d622a" (UID: "7d553f61-65fc-42d0-9be8-8a8f7c0d622a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:55:57.380334 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.380309 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kube-api-access-6c56n" (OuterVolumeSpecName: "kube-api-access-6c56n") pod "7d553f61-65fc-42d0-9be8-8a8f7c0d622a" (UID: "7d553f61-65fc-42d0-9be8-8a8f7c0d622a"). InnerVolumeSpecName "kube-api-access-6c56n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:55:57.479170 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.479136 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6c56n\" (UniqueName: \"kubernetes.io/projected/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kube-api-access-6c56n\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:57.479170 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.479166 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:57.479170 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.479176 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:57.479385 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.479190 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d553f61-65fc-42d0-9be8-8a8f7c0d622a-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\"" Apr 23 18:55:57.669951 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.669852 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n_7d553f61-65fc-42d0-9be8-8a8f7c0d622a/storage-initializer/1.log" Apr 23 18:55:57.670342 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.670215 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n_7d553f61-65fc-42d0-9be8-8a8f7c0d622a/storage-initializer/0.log" Apr 23 18:55:57.670342 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.670259 2568 generic.go:358] "Generic (PLEG): container finished" podID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerID="adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac" exitCode=1 Apr 23 18:55:57.670440 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.670337 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" Apr 23 18:55:57.670440 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.670353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" event={"ID":"7d553f61-65fc-42d0-9be8-8a8f7c0d622a","Type":"ContainerDied","Data":"adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac"} Apr 23 18:55:57.670440 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.670389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n" event={"ID":"7d553f61-65fc-42d0-9be8-8a8f7c0d622a","Type":"ContainerDied","Data":"99a702b7d938f7a22bcec020d41504a5e7b1be2a742e05aa23dadcce073ffe24"} Apr 23 18:55:57.670440 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.670411 2568 scope.go:117] "RemoveContainer" containerID="adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac" Apr 23 18:55:57.671836 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.671811 2568 generic.go:358] "Generic (PLEG): container finished" podID="c455d1bd-98b4-472b-b639-999401359fd4" containerID="3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174" exitCode=0 Apr 23 18:55:57.671973 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.671852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerDied","Data":"3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174"} Apr 23 18:55:57.678752 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.678689 2568 scope.go:117] "RemoveContainer" containerID="dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd" Apr 23 18:55:57.686126 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.686079 2568 scope.go:117] "RemoveContainer" containerID="adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac" Apr 23 18:55:57.686436 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:55:57.686411 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac\": container with ID starting with adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac not found: ID does not exist" containerID="adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac" Apr 23 18:55:57.686507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.686450 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac"} err="failed to get container status \"adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac\": rpc error: code = NotFound desc = could not find container \"adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac\": container with ID starting with adc6e6ccc9647995ac4d6264013a29174e76859f51d67bc75b142b5f4293caac not found: ID does not exist" Apr 23 18:55:57.686507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.686477 2568 scope.go:117] "RemoveContainer" containerID="dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd" Apr 23 18:55:57.686749 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:55:57.686732 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd\": container with ID starting with dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd not found: ID does not exist" containerID="dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd" Apr 23 18:55:57.686791 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.686753 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd"} err="failed to get container status \"dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd\": rpc error: code = NotFound desc = could not find container \"dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd\": container with ID starting with dbc5959a5da7c5b428106c00c528c8a71bcdcbd6b8b548afc07c73974048c8cd not found: ID does not exist" Apr 23 18:55:57.721856 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.721829 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n"] Apr 23 18:55:57.726980 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:57.726950 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-574c4b9dd4-nh84n"] Apr 23 18:55:58.599521 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:58.599489 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" path="/var/lib/kubelet/pods/7d553f61-65fc-42d0-9be8-8a8f7c0d622a/volumes" Apr 23 18:55:58.677572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:58.677538 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerStarted","Data":"53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52"} Apr 23 18:55:58.677572 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:58.677574 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerStarted","Data":"141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc"} Apr 23 18:55:58.678029 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:58.677727 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:58.700138 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:58.700087 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podStartSLOduration=3.700071303 podStartE2EDuration="3.700071303s" podCreationTimestamp="2026-04-23 18:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:55:58.698786412 +0000 UTC m=+3816.618878887" watchObservedRunningTime="2026-04-23 18:55:58.700071303 +0000 UTC m=+3816.620163778" Apr 23 18:55:59.680961 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:55:59.680917 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:55:59.682231 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:55:59.682203 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:00.685053 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:00.685011 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:05.689774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:05.689744 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 18:56:05.690267 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:05.690241 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:15.691065 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:15.691027 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:25.690394 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:25.690352 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:35.690230 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:35.690149 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:45.691084 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:45.691043 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:56:55.690570 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:56:55.690529 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:57:05.690923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:05.690887 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" Apr 23 
18:57:15.421156 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.421117 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"] Apr 23 18:57:15.421601 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.421453 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" containerID="cri-o://141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc" gracePeriod=30 Apr 23 18:57:15.421601 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.421515 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kube-rbac-proxy" containerID="cri-o://53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52" gracePeriod=30 Apr 23 18:57:15.685790 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.685677 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.61:8643/healthz\": dial tcp 10.133.0.61:8643: connect: connection refused" Apr 23 18:57:15.691099 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.691061 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 18:57:15.897797 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.897763 2568 generic.go:358] "Generic (PLEG): container finished" podID="c455d1bd-98b4-472b-b639-999401359fd4" containerID="53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52" exitCode=2 Apr 23 18:57:15.898009 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:15.897838 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerDied","Data":"53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52"} Apr 23 18:57:16.492179 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492144 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"] Apr 23 18:57:16.492604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492449 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerName="storage-initializer" Apr 23 18:57:16.492604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492467 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerName="storage-initializer" Apr 23 18:57:16.492604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492480 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerName="storage-initializer" Apr 23 18:57:16.492604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492488 2568 state_mem.go:107] "Deleted 
Apr 23 18:57:16.492604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492574 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerName="storage-initializer"
Apr 23 18:57:16.492604 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.492584 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d553f61-65fc-42d0-9be8-8a8f7c0d622a" containerName="storage-initializer"
Apr 23 18:57:16.495701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.495678 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.497665 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.497645 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\""
Apr 23 18:57:16.497829 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.497812 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\""
Apr 23 18:57:16.505923 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.505899 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"]
Apr 23 18:57:16.581368 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.581329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92420c46-08d8-4be9-a193-8ae9341b0822-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.581561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.581375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92420c46-08d8-4be9-a193-8ae9341b0822-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.581561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.581399 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwf5q\" (UniqueName: \"kubernetes.io/projected/92420c46-08d8-4be9-a193-8ae9341b0822-kube-api-access-rwf5q\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.581561 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.581504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.682058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.682013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92420c46-08d8-4be9-a193-8ae9341b0822-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.682058 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.682059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwf5q\" (UniqueName: \"kubernetes.io/projected/92420c46-08d8-4be9-a193-8ae9341b0822-kube-api-access-rwf5q\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.682319 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.682086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.682319 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.682134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92420c46-08d8-4be9-a193-8ae9341b0822-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.682319 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:16.682246 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 23 18:57:16.682483 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:16.682356 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls podName:92420c46-08d8-4be9-a193-8ae9341b0822 nodeName:}" failed. No retries permitted until 2026-04-23 18:57:17.182332255 +0000 UTC m=+3895.102424708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" (UID: "92420c46-08d8-4be9-a193-8ae9341b0822") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 23 18:57:16.682483 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.682466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92420c46-08d8-4be9-a193-8ae9341b0822-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.682752 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.682733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92420c46-08d8-4be9-a193-8ae9341b0822-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:16.692765 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:16.692738 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwf5q\" (UniqueName: \"kubernetes.io/projected/92420c46-08d8-4be9-a193-8ae9341b0822-kube-api-access-rwf5q\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:17.186281 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:17.186244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:17.188807 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:17.188774 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:17.406088 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:17.406049 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
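The proxy-tls mount above failed because the serving-cert secret did not exist yet; kubelet parked the operation ("No retries permitted until ... durationBeforeRetry 500ms") and the retry at 18:57:17.188 succeeded once the secret appeared. A sketch of that retry pacing: the 500ms start is read from the log; the doubling and the 2m2s cap mirror the volume manager's usual defaults but are assumptions here, not shown in this log.

```go
// Sketch: exponential backoff of the kind implied by "durationBeforeRetry 500ms".
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap, not read from the log
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: wait %v before retrying\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```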
Apr 23 18:57:17.526716 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:17.526683 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"]
Apr 23 18:57:17.530074 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:57:17.530039 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92420c46_08d8_4be9_a193_8ae9341b0822.slice/crio-8c342d75d68799be513db18a3a4342e3b2c4507ade72fd6e4bb96c1a6e8f82c0 WatchSource:0}: Error finding container 8c342d75d68799be513db18a3a4342e3b2c4507ade72fd6e4bb96c1a6e8f82c0: Status 404 returned error can't find the container with id 8c342d75d68799be513db18a3a4342e3b2c4507ade72fd6e4bb96c1a6e8f82c0
Apr 23 18:57:17.904899 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:17.904865 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" event={"ID":"92420c46-08d8-4be9-a193-8ae9341b0822","Type":"ContainerStarted","Data":"a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502"}
Apr 23 18:57:17.904899 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:17.904902 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" event={"ID":"92420c46-08d8-4be9-a193-8ae9341b0822","Type":"ContainerStarted","Data":"8c342d75d68799be513db18a3a4342e3b2c4507ade72fd6e4bb96c1a6e8f82c0"}
Apr 23 18:57:19.865052 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.865024 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"
Apr 23 18:57:19.915210 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.915178 2568 generic.go:358] "Generic (PLEG): container finished" podID="c455d1bd-98b4-472b-b639-999401359fd4" containerID="141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc" exitCode=0
Apr 23 18:57:19.915210 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.915215 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerDied","Data":"141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc"}
Apr 23 18:57:19.915448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.915236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh" event={"ID":"c455d1bd-98b4-472b-b639-999401359fd4","Type":"ContainerDied","Data":"5154a8edf44022487c530f660b1e7b19842834312d8e5099699999371915fd32"}
Apr 23 18:57:19.915448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.915253 2568 scope.go:117] "RemoveContainer" containerID="53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52"
Apr 23 18:57:19.915448 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.915254 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"
Apr 23 18:57:19.922694 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.922668 2568 scope.go:117] "RemoveContainer" containerID="141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc"
Apr 23 18:57:19.929761 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.929739 2568 scope.go:117] "RemoveContainer" containerID="3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174"
Apr 23 18:57:19.936462 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.936441 2568 scope.go:117] "RemoveContainer" containerID="53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52"
Apr 23 18:57:19.936721 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:19.936702 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52\": container with ID starting with 53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52 not found: ID does not exist" containerID="53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52"
Apr 23 18:57:19.936772 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.936731 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52"} err="failed to get container status \"53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52\": rpc error: code = NotFound desc = could not find container \"53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52\": container with ID starting with 53ea9a4fed3bdc282ab2faedc32331a57544041ed06ace5c3cd47063f27d3c52 not found: ID does not exist"
Apr 23 18:57:19.936772 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.936750 2568 scope.go:117] "RemoveContainer" containerID="141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc"
Apr 23 18:57:19.937001 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:19.936976 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc\": container with ID starting with 141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc not found: ID does not exist" containerID="141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc"
Apr 23 18:57:19.937104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.937006 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc"} err="failed to get container status \"141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc\": rpc error: code = NotFound desc = could not find container \"141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc\": container with ID starting with 141a9bee373063046264ab26ddf2c2234c919770be0f6516bc5a1320bbc1eafc not found: ID does not exist"
Apr 23 18:57:19.937104 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.937022 2568 scope.go:117] "RemoveContainer" containerID="3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174"
Apr 23 18:57:19.937256 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:19.937238 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174\": container with ID starting with 3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174 not found: ID does not exist" containerID="3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174"
Apr 23 18:57:19.937310 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:19.937264 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174"} err="failed to get container status \"3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174\": rpc error: code = NotFound desc = could not find container \"3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174\": container with ID starting with 3d4788d8710c35802fa1b425c23eaa011e2cd6912e535a6d8aef6904524c9174 not found: ID does not exist"
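The NotFound errors above are the benign tail of a race: the containers were already removed by the time the follow-up status query ran, so "DeleteContainer returned error" is logged but nothing is actually wrong. A minimal sketch of the underlying idiom, treating "already gone" as success; errNotFound and deleteContainer are stand-ins, not the kubelet's real CRI client:

```go
// Sketch: idempotent container removal that tolerates "already deleted".
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("could not find container: ID does not exist")

func deleteContainer(id string) error {
	// Stand-in: pretend the runtime already removed this container.
	return fmt.Errorf("container %s: %w", id, errNotFound)
}

func removeIdempotent(id string) error {
	if err := deleteContainer(id); err != nil && !errors.Is(err, errNotFound) {
		return err // a real failure
	}
	return nil // removed now, or already gone: desired state reached
}

func main() {
	fmt.Println("cleanup error:", removeIdempotent("53ea9a4fed3b"))
}
```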
Apr 23 18:57:20.009174 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009136 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-cabundle-cert\") pod \"c455d1bd-98b4-472b-b639-999401359fd4\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") "
Apr 23 18:57:20.009174 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009180 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c455d1bd-98b4-472b-b639-999401359fd4-proxy-tls\") pod \"c455d1bd-98b4-472b-b639-999401359fd4\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") "
Apr 23 18:57:20.009396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009199 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c455d1bd-98b4-472b-b639-999401359fd4-kserve-provision-location\") pod \"c455d1bd-98b4-472b-b639-999401359fd4\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") "
Apr 23 18:57:20.009396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009227 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"c455d1bd-98b4-472b-b639-999401359fd4\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") "
Apr 23 18:57:20.009396 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009249 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wknnz\" (UniqueName: \"kubernetes.io/projected/c455d1bd-98b4-472b-b639-999401359fd4-kube-api-access-wknnz\") pod \"c455d1bd-98b4-472b-b639-999401359fd4\" (UID: \"c455d1bd-98b4-472b-b639-999401359fd4\") "
Apr 23 18:57:20.009654 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009623 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c455d1bd-98b4-472b-b639-999401359fd4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c455d1bd-98b4-472b-b639-999401359fd4" (UID: "c455d1bd-98b4-472b-b639-999401359fd4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:57:20.009654 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009639 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "c455d1bd-98b4-472b-b639-999401359fd4" (UID: "c455d1bd-98b4-472b-b639-999401359fd4"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:57:20.009769 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.009653 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c455d1bd-98b4-472b-b639-999401359fd4" (UID: "c455d1bd-98b4-472b-b639-999401359fd4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:57:20.011412 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.011381 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c455d1bd-98b4-472b-b639-999401359fd4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c455d1bd-98b4-472b-b639-999401359fd4" (UID: "c455d1bd-98b4-472b-b639-999401359fd4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:57:20.011584 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.011555 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c455d1bd-98b4-472b-b639-999401359fd4-kube-api-access-wknnz" (OuterVolumeSpecName: "kube-api-access-wknnz") pod "c455d1bd-98b4-472b-b639-999401359fd4" (UID: "c455d1bd-98b4-472b-b639-999401359fd4"). InnerVolumeSpecName "kube-api-access-wknnz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:57:20.110107 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.110066 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wknnz\" (UniqueName: \"kubernetes.io/projected/c455d1bd-98b4-472b-b639-999401359fd4-kube-api-access-wknnz\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:20.110107 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.110101 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-cabundle-cert\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:20.110107 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.110112 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c455d1bd-98b4-472b-b639-999401359fd4-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:20.110349 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.110120 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c455d1bd-98b4-472b-b639-999401359fd4-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:20.110349 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.110131 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c455d1bd-98b4-472b-b639-999401359fd4-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:20.239085 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.239046 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"]
Apr 23 18:57:20.243667 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.243639 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64b6bb485-rtfwh"]
Apr 23 18:57:20.598877 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:20.598838 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c455d1bd-98b4-472b-b639-999401359fd4" path="/var/lib/kubelet/pods/c455d1bd-98b4-472b-b639-999401359fd4/volumes"
Apr 23 18:57:22.758300 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.758208 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/0.log"
Apr 23 18:57:22.763611 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.763588 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/0.log"
Apr 23 18:57:22.767414 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.767397 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:57:22.771048 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.771026 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:57:22.771908 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.771891 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log"
Apr 23 18:57:22.775225 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.775209 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-131-144.ec2.internal_8dc4456d0c7e39a0756b16fde0e83318/kube-rbac-proxy-crio/4.log"
Apr 23 18:57:22.926338 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.926313 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/0.log"
Apr 23 18:57:22.926507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.926347 2568 generic.go:358] "Generic (PLEG): container finished" podID="92420c46-08d8-4be9-a193-8ae9341b0822" containerID="a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502" exitCode=1
Apr 23 18:57:22.926507 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:22.926395 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" event={"ID":"92420c46-08d8-4be9-a193-8ae9341b0822","Type":"ContainerDied","Data":"a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502"}
Apr 23 18:57:23.930436 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:23.930406 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/0.log"
Apr 23 18:57:23.930817 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:23.930497 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" event={"ID":"92420c46-08d8-4be9-a193-8ae9341b0822","Type":"ContainerStarted","Data":"35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e"}
Apr 23 18:57:26.490871 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:26.490832 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"]
Apr 23 18:57:26.491319 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:26.491176 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer" containerID="cri-o://35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e" gracePeriod=30
Apr 23 18:57:27.618200 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.618179 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/1.log"
Apr 23 18:57:27.618547 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.618520 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/0.log"
Apr 23 18:57:27.618592 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.618579 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
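The "Finished parsing log file" paths above follow the on-disk layout visible throughout this log: /var/log/pods/&lt;namespace&gt;_&lt;pod&gt;_&lt;podUID&gt;/&lt;container&gt;/&lt;restartCount&gt;.log, so the 0.log/1.log pair for storage-initializer is the container before and after its restart. A small helper that just assembles this layout as an illustration of the naming scheme, not a kubelet API:

```go
// Sketch: build a pod log path in the namespace_pod_uid/container/N.log layout.
package main

import (
	"fmt"
	"path/filepath"
)

func podLogFile(namespace, pod, podUID, container string, restart int) string {
	return filepath.Join("/var/log/pods",
		fmt.Sprintf("%s_%s_%s", namespace, pod, podUID),
		container,
		fmt.Sprintf("%d.log", restart))
}

func main() {
	// Reproduces the storage-initializer path from the entries above;
	// restart 1 is the file created after the first crash and restart.
	fmt.Println(podLogFile("kserve-ci-e2e-test",
		"isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz",
		"92420c46-08d8-4be9-a193-8ae9341b0822",
		"storage-initializer", 1))
}
```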
Apr 23 18:57:27.767421 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.767320 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92420c46-08d8-4be9-a193-8ae9341b0822-kserve-provision-location\") pod \"92420c46-08d8-4be9-a193-8ae9341b0822\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") "
Apr 23 18:57:27.767421 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.767395 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92420c46-08d8-4be9-a193-8ae9341b0822-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"92420c46-08d8-4be9-a193-8ae9341b0822\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") "
Apr 23 18:57:27.767421 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.767423 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls\") pod \"92420c46-08d8-4be9-a193-8ae9341b0822\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") "
Apr 23 18:57:27.767661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.767455 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwf5q\" (UniqueName: \"kubernetes.io/projected/92420c46-08d8-4be9-a193-8ae9341b0822-kube-api-access-rwf5q\") pod \"92420c46-08d8-4be9-a193-8ae9341b0822\" (UID: \"92420c46-08d8-4be9-a193-8ae9341b0822\") "
Apr 23 18:57:27.767661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.767609 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92420c46-08d8-4be9-a193-8ae9341b0822-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "92420c46-08d8-4be9-a193-8ae9341b0822" (UID: "92420c46-08d8-4be9-a193-8ae9341b0822"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:57:27.767763 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.767741 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92420c46-08d8-4be9-a193-8ae9341b0822-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "92420c46-08d8-4be9-a193-8ae9341b0822" (UID: "92420c46-08d8-4be9-a193-8ae9341b0822"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:57:27.769474 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.769455 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92420c46-08d8-4be9-a193-8ae9341b0822-kube-api-access-rwf5q" (OuterVolumeSpecName: "kube-api-access-rwf5q") pod "92420c46-08d8-4be9-a193-8ae9341b0822" (UID: "92420c46-08d8-4be9-a193-8ae9341b0822"). InnerVolumeSpecName "kube-api-access-rwf5q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:57:27.769652 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.769634 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "92420c46-08d8-4be9-a193-8ae9341b0822" (UID: "92420c46-08d8-4be9-a193-8ae9341b0822"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:57:27.868709 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.868678 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92420c46-08d8-4be9-a193-8ae9341b0822-kserve-provision-location\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:27.868709 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.868705 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92420c46-08d8-4be9-a193-8ae9341b0822-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:27.868709 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.868717 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92420c46-08d8-4be9-a193-8ae9341b0822-proxy-tls\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:27.868958 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.868728 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rwf5q\" (UniqueName: \"kubernetes.io/projected/92420c46-08d8-4be9-a193-8ae9341b0822-kube-api-access-rwf5q\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:57:27.942374 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942347 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/1.log"
Apr 23 18:57:27.942726 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942709 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz_92420c46-08d8-4be9-a193-8ae9341b0822/storage-initializer/0.log"
Apr 23 18:57:27.942779 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942746 2568 generic.go:358] "Generic (PLEG): container finished" podID="92420c46-08d8-4be9-a193-8ae9341b0822" containerID="35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e" exitCode=1
Apr 23 18:57:27.942815 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" event={"ID":"92420c46-08d8-4be9-a193-8ae9341b0822","Type":"ContainerDied","Data":"35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e"}
Apr 23 18:57:27.942851 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz" event={"ID":"92420c46-08d8-4be9-a193-8ae9341b0822","Type":"ContainerDied","Data":"8c342d75d68799be513db18a3a4342e3b2c4507ade72fd6e4bb96c1a6e8f82c0"}
Apr 23 18:57:27.942851 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942834 2568 scope.go:117] "RemoveContainer" containerID="35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e"
Apr 23 18:57:27.942925 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.942854 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"
Apr 23 18:57:27.950701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.950679 2568 scope.go:117] "RemoveContainer" containerID="a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502"
Apr 23 18:57:27.957634 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.957619 2568 scope.go:117] "RemoveContainer" containerID="35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e"
Apr 23 18:57:27.957855 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:27.957840 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e\": container with ID starting with 35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e not found: ID does not exist" containerID="35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e"
Apr 23 18:57:27.957898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.957863 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e"} err="failed to get container status \"35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e\": rpc error: code = NotFound desc = could not find container \"35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e\": container with ID starting with 35389377a541a5912e13b0b59d39292e35bebe02b73411e559fe2b328b82757e not found: ID does not exist"
Apr 23 18:57:27.957898 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.957879 2568 scope.go:117] "RemoveContainer" containerID="a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502"
Apr 23 18:57:27.958090 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:57:27.958075 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502\": container with ID starting with a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502 not found: ID does not exist" containerID="a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502"
Apr 23 18:57:27.958138 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.958096 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502"} err="failed to get container status \"a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502\": rpc error: code = NotFound desc = could not find container \"a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502\": container with ID starting with a0f0eb650168b97259005f3458109ae2ad8f05935d9873e7e79a3bf65d499502 not found: ID does not exist"
Apr 23 18:57:27.978833 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.978808 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"]
Apr 23 18:57:27.983555 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:27.983527 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-797c5c84dc-b67pz"]
Apr 23 18:57:28.235746 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.235715 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p7cw9/must-gather-m9hbg"]
Apr 23 18:57:28.236036 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236022 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236037 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236050 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236055 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236065 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236071 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236077 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kube-rbac-proxy"
Apr 23 18:57:28.236086 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236082 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kube-rbac-proxy"
Apr 23 18:57:28.236333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236090 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="storage-initializer"
Apr 23 18:57:28.236333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236096 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="storage-initializer"
Apr 23 18:57:28.236333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236134 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer"
Apr 23 18:57:28.236333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236144 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kube-rbac-proxy"
Apr 23 18:57:28.236333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236152 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" containerName="storage-initializer"
Apr 23 18:57:28.236333 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.236160 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c455d1bd-98b4-472b-b639-999401359fd4" containerName="kserve-container"
Apr 23 18:57:28.240261 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.240244 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.242286 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.242265 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p7cw9\"/\"default-dockercfg-kqvhh\""
Apr 23 18:57:28.242415 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.242268 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p7cw9\"/\"openshift-service-ca.crt\""
Apr 23 18:57:28.242415 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.242300 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p7cw9\"/\"kube-root-ca.crt\""
Apr 23 18:57:28.251778 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.251756 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p7cw9/must-gather-m9hbg"]
Apr 23 18:57:28.372520 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.372470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g64zh\" (UniqueName: \"kubernetes.io/projected/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-kube-api-access-g64zh\") pod \"must-gather-m9hbg\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") " pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.372706 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.372542 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-must-gather-output\") pod \"must-gather-m9hbg\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") " pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.473453 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.473414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g64zh\" (UniqueName: \"kubernetes.io/projected/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-kube-api-access-g64zh\") pod \"must-gather-m9hbg\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") " pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.473655 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.473489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-must-gather-output\") pod \"must-gather-m9hbg\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") " pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.473838 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.473819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-must-gather-output\") pod \"must-gather-m9hbg\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") " pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.481918 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.481894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64zh\" (UniqueName: \"kubernetes.io/projected/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-kube-api-access-g64zh\") pod \"must-gather-m9hbg\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") " pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.557055 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.556968 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:57:28.598811 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.598779 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92420c46-08d8-4be9-a193-8ae9341b0822" path="/var/lib/kubelet/pods/92420c46-08d8-4be9-a193-8ae9341b0822/volumes"
Apr 23 18:57:28.676345 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.676313 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p7cw9/must-gather-m9hbg"]
Apr 23 18:57:28.679811 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:57:28.679784 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad958c5b_e96d_43fe_9be9_d9ca8bc34a28.slice/crio-294c74aee57a91e05502ca95dcfcfb891f1e11c924a7bfd76cccce3d69578004 WatchSource:0}: Error finding container 294c74aee57a91e05502ca95dcfcfb891f1e11c924a7bfd76cccce3d69578004: Status 404 returned error can't find the container with id 294c74aee57a91e05502ca95dcfcfb891f1e11c924a7bfd76cccce3d69578004
Apr 23 18:57:28.947156 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:28.947119 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" event={"ID":"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28","Type":"ContainerStarted","Data":"294c74aee57a91e05502ca95dcfcfb891f1e11c924a7bfd76cccce3d69578004"}
Apr 23 18:57:32.960698 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:32.960657 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" event={"ID":"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28","Type":"ContainerStarted","Data":"5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085"}
Apr 23 18:57:32.960698 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:32.960698 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" event={"ID":"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28","Type":"ContainerStarted","Data":"9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807"}
Apr 23 18:57:32.978903 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:32.978819 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" podStartSLOduration=1.161132905 podStartE2EDuration="4.978800701s" podCreationTimestamp="2026-04-23 18:57:28 +0000 UTC" firstStartedPulling="2026-04-23 18:57:28.681467982 +0000 UTC m=+3906.601560439" lastFinishedPulling="2026-04-23 18:57:32.49913577 +0000 UTC m=+3910.419228235" observedRunningTime="2026-04-23 18:57:32.97774006 +0000 UTC m=+3910.897832532" watchObservedRunningTime="2026-04-23 18:57:32.978800701 +0000 UTC m=+3910.898893179"
Apr 23 18:57:55.025911 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:55.025876 2568 generic.go:358] "Generic (PLEG): container finished" podID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerID="9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807" exitCode=0
Apr 23 18:57:55.026376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:55.025951 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" event={"ID":"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28","Type":"ContainerDied","Data":"9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807"}
Apr 23 18:57:55.026376 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:55.026272 2568 scope.go:117] "RemoveContainer" containerID="9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807"
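The "Observed pod startup duration" entry above can be checked by hand. podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (18:57:32.978800701 - 18:57:28 = 4.978800701s), and podStartSLOduration subtracts the image-pull window, computed on the monotonic clock (the m=+... values). A worked sketch:

```go
// Worked arithmetic for the pod startup duration entry, using the
// monotonic clock readings (m=+...) copied from the log.
package main

import "fmt"

func main() {
	firstStartedPulling := 3906.601560439 // 18:57:28.681467982
	lastFinishedPulling := 3910.419228235 // 18:57:32.499135770
	// E2E: watchObservedRunningTime (18:57:32.978800701) minus
	// podCreationTimestamp (18:57:28 exactly).
	e2e := 4.978800701
	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, e2e-pull)
	// Prints pull=3.817667796s slo=1.161132905s (up to float rounding),
	// matching podStartSLOduration=1.161132905 in the entry above.
}
```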
Apr 23 18:57:55.466779 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:55.466742 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p7cw9_must-gather-m9hbg_ad958c5b-e96d-43fe-9be9-d9ca8bc34a28/gather/0.log"
Apr 23 18:57:56.185581 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.185540 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pzxs4/must-gather-t9wqc"]
Apr 23 18:57:56.190851 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.190824 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.192716 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.192693 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pzxs4\"/\"kube-root-ca.crt\""
Apr 23 18:57:56.193149 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.193133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pzxs4\"/\"openshift-service-ca.crt\""
Apr 23 18:57:56.193223 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.193140 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pzxs4\"/\"default-dockercfg-92lwm\""
Apr 23 18:57:56.197293 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.197270 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pzxs4/must-gather-t9wqc"]
Apr 23 18:57:56.291315 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.291274 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfm5n\" (UniqueName: \"kubernetes.io/projected/3dd65b8d-e1fc-4f82-ba75-4672160cc60e-kube-api-access-mfm5n\") pod \"must-gather-t9wqc\" (UID: \"3dd65b8d-e1fc-4f82-ba75-4672160cc60e\") " pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.291315 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.291319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dd65b8d-e1fc-4f82-ba75-4672160cc60e-must-gather-output\") pod \"must-gather-t9wqc\" (UID: \"3dd65b8d-e1fc-4f82-ba75-4672160cc60e\") " pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.392588 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.392554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfm5n\" (UniqueName: \"kubernetes.io/projected/3dd65b8d-e1fc-4f82-ba75-4672160cc60e-kube-api-access-mfm5n\") pod \"must-gather-t9wqc\" (UID: \"3dd65b8d-e1fc-4f82-ba75-4672160cc60e\") " pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.392794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.392594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dd65b8d-e1fc-4f82-ba75-4672160cc60e-must-gather-output\") pod \"must-gather-t9wqc\" (UID: \"3dd65b8d-e1fc-4f82-ba75-4672160cc60e\") " pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.393014 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.392984 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dd65b8d-e1fc-4f82-ba75-4672160cc60e-must-gather-output\") pod \"must-gather-t9wqc\" (UID: \"3dd65b8d-e1fc-4f82-ba75-4672160cc60e\") " pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.403026 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.403001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfm5n\" (UniqueName: \"kubernetes.io/projected/3dd65b8d-e1fc-4f82-ba75-4672160cc60e-kube-api-access-mfm5n\") pod \"must-gather-t9wqc\" (UID: \"3dd65b8d-e1fc-4f82-ba75-4672160cc60e\") " pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.501418 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.501318 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pzxs4/must-gather-t9wqc"
Apr 23 18:57:56.621049 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:56.621020 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pzxs4/must-gather-t9wqc"]
Apr 23 18:57:56.623753 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:57:56.623725 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dd65b8d_e1fc_4f82_ba75_4672160cc60e.slice/crio-93785fce44014a275d8dd099c9712253d4c7ff5baaede207f1c8e77fa56c9b5f WatchSource:0}: Error finding container 93785fce44014a275d8dd099c9712253d4c7ff5baaede207f1c8e77fa56c9b5f: Status 404 returned error can't find the container with id 93785fce44014a275d8dd099c9712253d4c7ff5baaede207f1c8e77fa56c9b5f
Apr 23 18:57:57.032540 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:57.032499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pzxs4/must-gather-t9wqc" event={"ID":"3dd65b8d-e1fc-4f82-ba75-4672160cc60e","Type":"ContainerStarted","Data":"93785fce44014a275d8dd099c9712253d4c7ff5baaede207f1c8e77fa56c9b5f"}
Apr 23 18:57:58.039395 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:58.039302 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pzxs4/must-gather-t9wqc" event={"ID":"3dd65b8d-e1fc-4f82-ba75-4672160cc60e","Type":"ContainerStarted","Data":"a52ff13ac3f84e7949094dddd91230a6b1596ed7fa9183bb886b8f934167b11f"}
Apr 23 18:57:58.039395 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:58.039359 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pzxs4/must-gather-t9wqc" event={"ID":"3dd65b8d-e1fc-4f82-ba75-4672160cc60e","Type":"ContainerStarted","Data":"4b2daa827e9ca539def5beae3ddf91566d3035edabf9c9d180182213f847cf6c"}
Apr 23 18:57:58.057530 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:58.057474 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pzxs4/must-gather-t9wqc" podStartSLOduration=1.312063406 podStartE2EDuration="2.057458121s" podCreationTimestamp="2026-04-23 18:57:56 +0000 UTC" firstStartedPulling="2026-04-23 18:57:56.625616657 +0000 UTC m=+3934.545709111" lastFinishedPulling="2026-04-23 18:57:57.37101137 +0000 UTC m=+3935.291103826" observedRunningTime="2026-04-23 18:57:58.055752903 +0000 UTC m=+3935.975845378" watchObservedRunningTime="2026-04-23 18:57:58.057458121 +0000 UTC m=+3935.977550597"
Apr 23 18:57:59.075956 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:59.075907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8vtxf_f51b2065-4e39-46ce-8dda-c894907d1f96/global-pull-secret-syncer/0.log"
Apr 23 18:57:59.207519 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:59.207482 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2sr6m_41a9f941-4751-4a1c-b87e-c4384db8566d/konnectivity-agent/0.log"
Apr 23 18:57:59.350025 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:57:59.349943 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-144.ec2.internal_68669faf3d2c20c0dc154d751d5f0070/haproxy/0.log"
Apr 23 18:58:00.994954 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:00.994572 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p7cw9/must-gather-m9hbg"]
Apr 23 18:58:00.994954 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:00.994839 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="copy" containerID="cri-o://5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085" gracePeriod=2
Apr 23 18:58:01.003140 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.001316 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p7cw9/must-gather-m9hbg"]
Apr 23 18:58:01.373954 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.373886 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p7cw9_must-gather-m9hbg_ad958c5b-e96d-43fe-9be9-d9ca8bc34a28/copy/0.log"
Apr 23 18:58:01.375955 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.374386 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
Apr 23 18:58:01.380749 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.376134 2568 status_manager.go:895] "Failed to get status for pod" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" err="pods \"must-gather-m9hbg\" is forbidden: User \"system:node:ip-10-0-131-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-p7cw9\": no relationship found between node 'ip-10-0-131-144.ec2.internal' and this object"
Apr 23 18:58:01.442988 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.441355 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-must-gather-output\") pod \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") "
Apr 23 18:58:01.443182 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.442903 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" (UID: "ad958c5b-e96d-43fe-9be9-d9ca8bc34a28"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:58:01.443182 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.443100 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g64zh\" (UniqueName: \"kubernetes.io/projected/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-kube-api-access-g64zh\") pod \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\" (UID: \"ad958c5b-e96d-43fe-9be9-d9ca8bc34a28\") "
Apr 23 18:58:01.444652 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.444622 2568 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-must-gather-output\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:58:01.455175 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.454967 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-kube-api-access-g64zh" (OuterVolumeSpecName: "kube-api-access-g64zh") pod "ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" (UID: "ad958c5b-e96d-43fe-9be9-d9ca8bc34a28"). InnerVolumeSpecName "kube-api-access-g64zh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:58:01.545790 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:01.545734 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g64zh\" (UniqueName: \"kubernetes.io/projected/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28-kube-api-access-g64zh\") on node \"ip-10-0-131-144.ec2.internal\" DevicePath \"\""
Apr 23 18:58:02.063901 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.063869 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p7cw9_must-gather-m9hbg_ad958c5b-e96d-43fe-9be9-d9ca8bc34a28/copy/0.log"
Apr 23 18:58:02.064608 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.064282 2568 generic.go:358] "Generic (PLEG): container finished" podID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerID="5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085" exitCode=143
Apr 23 18:58:02.064608 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.064392 2568 scope.go:117] "RemoveContainer" containerID="5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085"
Apr 23 18:58:02.064608 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.064520 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p7cw9/must-gather-m9hbg"
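The exitCode=143 above follows the usual 128+signal convention: the copy container was killed with gracePeriod=2, received SIGTERM (signal 15), and 128+15=143; a SIGKILL would show as 137. A small decoding sketch under that assumption:

```go
// Sketch: decode the 128+signal exit-code convention behind exitCode=143.
package main

import "fmt"

func describeExit(code int) string {
	if code > 128 {
		return fmt.Sprintf("killed by signal %d", code-128)
	}
	return fmt.Sprintf("exited normally with status %d", code)
}

func main() {
	fmt.Println(143, describeExit(143)) // signal 15 (SIGTERM)
	fmt.Println(137, describeExit(137)) // signal 9 (SIGKILL)
	fmt.Println(0, describeExit(0))     // clean exit
}
```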
Need to start a new one" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" Apr 23 18:58:02.080181 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.074669 2568 status_manager.go:895] "Failed to get status for pod" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" err="pods \"must-gather-m9hbg\" is forbidden: User \"system:node:ip-10-0-131-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-p7cw9\": no relationship found between node 'ip-10-0-131-144.ec2.internal' and this object" Apr 23 18:58:02.097071 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.097046 2568 scope.go:117] "RemoveContainer" containerID="9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807" Apr 23 18:58:02.098702 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.098675 2568 status_manager.go:895] "Failed to get status for pod" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" err="pods \"must-gather-m9hbg\" is forbidden: User \"system:node:ip-10-0-131-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-p7cw9\": no relationship found between node 'ip-10-0-131-144.ec2.internal' and this object" Apr 23 18:58:02.136100 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.136051 2568 scope.go:117] "RemoveContainer" containerID="5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085" Apr 23 18:58:02.136851 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:58:02.136802 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085\": container with ID starting with 5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085 not found: ID does not exist" containerID="5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085" Apr 23 18:58:02.137082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.136855 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085"} err="failed to get container status \"5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085\": rpc error: code = NotFound desc = could not find container \"5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085\": container with ID starting with 5d8fc28d5698c02f9315a89c18daa0aca566715f861f449c946744b0c2c45085 not found: ID does not exist" Apr 23 18:58:02.137082 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.136884 2568 scope.go:117] "RemoveContainer" containerID="9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807" Apr 23 18:58:02.137529 ip-10-0-131-144 kubenswrapper[2568]: E0423 18:58:02.137506 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807\": container with ID starting with 9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807 not found: ID does not exist" containerID="9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807" Apr 23 18:58:02.137718 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.137670 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807"} err="failed to get 
container status \"9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807\": rpc error: code = NotFound desc = could not find container \"9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807\": container with ID starting with 9ade19e42abd0cc010002eed63aef43f6dcc47a43f232568068206c966ce4807 not found: ID does not exist" Apr 23 18:58:02.601701 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.601221 2568 status_manager.go:895] "Failed to get status for pod" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" pod="openshift-must-gather-p7cw9/must-gather-m9hbg" err="pods \"must-gather-m9hbg\" is forbidden: User \"system:node:ip-10-0-131-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-p7cw9\": no relationship found between node 'ip-10-0-131-144.ec2.internal' and this object" Apr 23 18:58:02.608954 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:02.605141 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" path="/var/lib/kubelet/pods/ad958c5b-e96d-43fe-9be9-d9ca8bc34a28/volumes" Apr 23 18:58:03.093355 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.093319 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-64qgx_4e2e3a37-968e-4bc2-87c5-ca830cebbc44/node-exporter/0.log" Apr 23 18:58:03.125855 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.125799 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-64qgx_4e2e3a37-968e-4bc2-87c5-ca830cebbc44/kube-rbac-proxy/0.log" Apr 23 18:58:03.152970 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.152913 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-64qgx_4e2e3a37-968e-4bc2-87c5-ca830cebbc44/init-textfile/0.log" Apr 23 18:58:03.472474 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.472444 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/prometheus/0.log" Apr 23 18:58:03.493026 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.492995 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/config-reloader/0.log" Apr 23 18:58:03.519853 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.519823 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/thanos-sidecar/0.log" Apr 23 18:58:03.544356 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.544325 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/kube-rbac-proxy-web/0.log" Apr 23 18:58:03.573981 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.573951 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/kube-rbac-proxy/0.log" Apr 23 18:58:03.600074 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.600036 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/kube-rbac-proxy-thanos/0.log" Apr 23 18:58:03.628232 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:03.628201 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51d006ec-4b09-46fb-9905-6d8cdcb0a4ff/init-config-reloader/0.log" Apr 23 18:58:06.139460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.139424 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp"] Apr 23 18:58:06.140184 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.140090 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="copy" Apr 23 18:58:06.140184 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.140117 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="copy" Apr 23 18:58:06.140184 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.140141 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="gather" Apr 23 18:58:06.140184 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.140151 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="gather" Apr 23 18:58:06.140449 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.140216 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="copy" Apr 23 18:58:06.140449 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.140232 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad958c5b-e96d-43fe-9be9-d9ca8bc34a28" containerName="gather" Apr 23 18:58:06.144794 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.144769 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.150676 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.150645 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp"] Apr 23 18:58:06.183133 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.183103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-podres\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.183301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.183156 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-lib-modules\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.183301 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.183239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tcms\" (UniqueName: \"kubernetes.io/projected/8d2f174d-c2b1-48ac-911e-f2729565b0a3-kube-api-access-9tcms\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.183381 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.183304 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-proc\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.183381 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.183365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-sys\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284065 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284026 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-proc\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-sys\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-podres\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284157 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-proc\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-lib-modules\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-sys\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284270 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284224 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tcms\" (UniqueName: \"kubernetes.io/projected/8d2f174d-c2b1-48ac-911e-f2729565b0a3-kube-api-access-9tcms\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: 
\"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284516 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284286 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-lib-modules\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.284516 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.284288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d2f174d-c2b1-48ac-911e-f2729565b0a3-podres\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.292876 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.292849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tcms\" (UniqueName: \"kubernetes.io/projected/8d2f174d-c2b1-48ac-911e-f2729565b0a3-kube-api-access-9tcms\") pod \"perf-node-gather-daemonset-hxcrp\" (UID: \"8d2f174d-c2b1-48ac-911e-f2729565b0a3\") " pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.458573 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.458482 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:06.610476 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:06.610441 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp"] Apr 23 18:58:06.613978 ip-10-0-131-144 kubenswrapper[2568]: W0423 18:58:06.613943 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d2f174d_c2b1_48ac_911e_f2729565b0a3.slice/crio-adc77df32f9dfd01e00bdd9e04364db429938b96c4481cdad04e55b5d3a332db WatchSource:0}: Error finding container adc77df32f9dfd01e00bdd9e04364db429938b96c4481cdad04e55b5d3a332db: Status 404 returned error can't find the container with id adc77df32f9dfd01e00bdd9e04364db429938b96c4481cdad04e55b5d3a332db Apr 23 18:58:07.084834 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.084795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" event={"ID":"8d2f174d-c2b1-48ac-911e-f2729565b0a3","Type":"ContainerStarted","Data":"cfa1ddb1f83f72a1a4f6db8df69f174255a7f3255365fc03ef67cabd8e6714c5"} Apr 23 18:58:07.084834 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.084841 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" event={"ID":"8d2f174d-c2b1-48ac-911e-f2729565b0a3","Type":"ContainerStarted","Data":"adc77df32f9dfd01e00bdd9e04364db429938b96c4481cdad04e55b5d3a332db"} Apr 23 18:58:07.085143 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.084988 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:07.105562 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.105494 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" podStartSLOduration=1.105474909 
podStartE2EDuration="1.105474909s" podCreationTimestamp="2026-04-23 18:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:58:07.103917824 +0000 UTC m=+3945.024010299" watchObservedRunningTime="2026-04-23 18:58:07.105474909 +0000 UTC m=+3945.025567387" Apr 23 18:58:07.489324 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.489240 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w9st4_cf0eb35a-032a-4b56-a1d6-f08cffb1de45/dns/0.log" Apr 23 18:58:07.516774 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.516749 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w9st4_cf0eb35a-032a-4b56-a1d6-f08cffb1de45/kube-rbac-proxy/0.log" Apr 23 18:58:07.572812 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:07.572780 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r69nr_c7659d0d-36b6-4e60-9b82-1870686d5685/dns-node-resolver/0.log" Apr 23 18:58:08.139492 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:08.139464 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lw625_48f03a68-806b-44d8-b107-0af334f9ad51/node-ca/0.log" Apr 23 18:58:09.419872 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:09.419834 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xmnns_d8390fd8-fb61-4d17-af7d-e393c2393937/serve-healthcheck-canary/0.log" Apr 23 18:58:10.038978 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:10.038948 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xjc8h_66fc82e8-4b83-46d1-beec-4b668ecd1eeb/kube-rbac-proxy/0.log" Apr 23 18:58:10.070361 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:10.070316 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xjc8h_66fc82e8-4b83-46d1-beec-4b668ecd1eeb/exporter/0.log" Apr 23 18:58:10.101001 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:10.100970 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xjc8h_66fc82e8-4b83-46d1-beec-4b668ecd1eeb/extractor/0.log" Apr 23 18:58:12.289033 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:12.289000 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6fc5d867c5-gddln_5b72518b-17cb-450f-a719-9ea96e0f11a3/manager/0.log" Apr 23 18:58:12.782813 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:12.782783 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-tz8tt_cf98f8c0-a0ce-4624-80ee-cbc30558c504/s3-init/0.log" Apr 23 18:58:12.810081 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:12.810040 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-nlrj7_f056c1db-118b-4363-a38e-e50097295112/s3-tls-init-custom/0.log" Apr 23 18:58:12.843739 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:12.843712 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-shx7t_4aad7b75-b3fc-4a86-abf3-cd9964ed28a3/s3-tls-init-serving/0.log" Apr 23 18:58:12.881788 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:12.881744 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-bjccc_5a16254d-c5e1-4567-8042-5cdab52e09a5/seaweedfs/0.log" Apr 23 18:58:12.913823 ip-10-0-131-144 
kubenswrapper[2568]: I0423 18:58:12.913788 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-lxbm9_96c11d8f-83f1-4985-a4bb-dd57e96de581/seaweedfs-tls-custom/0.log" Apr 23 18:58:13.097150 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:13.097119 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pzxs4/perf-node-gather-daemonset-hxcrp" Apr 23 18:58:19.270236 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.270202 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/kube-multus-additional-cni-plugins/0.log" Apr 23 18:58:19.294982 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.294948 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/egress-router-binary-copy/0.log" Apr 23 18:58:19.318844 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.318816 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/cni-plugins/0.log" Apr 23 18:58:19.342405 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.342376 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/bond-cni-plugin/0.log" Apr 23 18:58:19.369487 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.369457 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/routeoverride-cni/0.log" Apr 23 18:58:19.393133 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.393101 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/whereabouts-cni-bincopy/0.log" Apr 23 18:58:19.416460 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.416429 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f8xkt_23f0cb8e-457e-471c-aba7-ab0e123e6f30/whereabouts-cni/0.log" Apr 23 18:58:19.680987 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.680954 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kljn6_e172a861-bbd7-4d2d-9045-1bfe854b4621/kube-multus/0.log" Apr 23 18:58:19.706033 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.706008 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2dh7j_9fe17f32-d37d-48dc-b42b-74bbc44d26d2/network-metrics-daemon/0.log" Apr 23 18:58:19.726012 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:19.725979 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2dh7j_9fe17f32-d37d-48dc-b42b-74bbc44d26d2/kube-rbac-proxy/0.log" Apr 23 18:58:21.426656 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.426613 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-controller/0.log" Apr 23 18:58:21.448228 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.448199 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/0.log" Apr 23 18:58:21.485435 
ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.485399 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovn-acl-logging/1.log" Apr 23 18:58:21.508661 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.508634 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/kube-rbac-proxy-node/0.log" Apr 23 18:58:21.532650 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.532619 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 18:58:21.557657 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.557620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/northd/0.log" Apr 23 18:58:21.583687 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.583661 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/nbdb/0.log" Apr 23 18:58:21.609455 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.609426 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/sbdb/0.log" Apr 23 18:58:21.807961 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:21.807848 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwrk7_52f4d4c2-438f-4486-8ffb-4afbaad14bf1/ovnkube-controller/0.log" Apr 23 18:58:22.915798 ip-10-0-131-144 kubenswrapper[2568]: I0423 18:58:22.915722 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7vcgr_705ed263-d57f-4121-a61c-7fca69acb455/network-check-target-container/0.log"