On-line Access: 2025-10-20

Received: 2025-04-30

Revision Accepted: 2025-09-05

Frontiers of Information Technology & Electronic Engineering 

Accepted manuscript available online (unedited version)


Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation


Author(s):  Jiaqi SHI1, 2, Xulong ZHANG1, Xiaoyang QU1, Junfei XIE1, 2, Jianzong WANG1

Affiliation(s):  1Ping An Technology (Shenzhen) Co., Ltd., Shenzhen 518000, China; 2University of Science and Technology of China, Hefei 230027, China

Corresponding email(s):  civilizwa@mail.ustc.edu.cn, zhangxulong@ieee.org, quxiaoy@gmail.com, xiejunfei@mail.ustc.edu.cn, jzwang@188.com

Key Words:  Financial large language models; Knowledge distillation; Model compression; Quantitative trading



Jiaqi SHI, Xulong ZHANG, Xiaoyang QU, Junfei XIE, Jianzong WANG. Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2500282

@article{FITEE2500282,
title="Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation",
author="Jiaqi SHI, Xulong ZHANG, Xiaoyang QU, Junfei XIE, Jianzong WANG",
journal="Frontiers of Information Technology & Electronic Engineering",
year="in press",
publisher="Zhejiang University Press & Springer",
doi="https://doi.org/10.1631/FITEE.2500282"
}

%0 Journal Article
%T Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation
%A Jiaqi SHI
%A Xulong ZHANG
%A Xiaoyang QU
%A Junfei XIE
%A Jianzong WANG
%J Frontiers of Information Technology & Electronic Engineering
%@ 2095-9184
%D in press
%I Zhejiang University Press & Springer
%R https://doi.org/10.1631/FITEE.2500282

TY - JOUR
T1 - Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation
A1 - Jiaqi SHI
A1 - Xulong ZHANG
A1 - Xiaoyang QU
A1 - Junfei XIE
A1 - Jianzong WANG
JO - Frontiers of Information Technology & Electronic Engineering
SN - 2095-9184
Y1 - in press
PB - Zhejiang University Press & Springer
DO - https://doi.org/10.1631/FITEE.2500282
ER -


Abstract: 
Financial large language models (FinLLMs) offer immense potential for financial applications, but excessive deployment expenditure and considerable inference latency remain major obstacles to their adoption. As a prominent compression methodology, knowledge distillation (KD) offers an effective solution to these difficulties. This work presents a comprehensive survey of how KD interacts with FinLLMs, covering three core aspects: strategy, application, and evaluation. At the strategy level, the review introduces a structured taxonomy to comparatively analyze existing distillation pathways. At the application level, it puts forward a logical "upstream-midstream-downstream" framework to systematically explain the practical value of distilled models in the financial field. At the evaluation level, to address the absence of standards in the financial domain, it constructs a comprehensive evaluation framework spanning multiple dimensions, including financial accuracy, reasoning fidelity, and robustness. In summary, this research aims to provide a clear roadmap for this interdisciplinary field and to accelerate the development of distilled FinLLMs.
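For readers unfamiliar with the compression technique the survey is built around, the sketch below illustrates the classic soft-label knowledge distillation objective (temperature-scaled KL divergence blended with hard-label cross-entropy). It is a minimal, generic example and not the specific distillation strategies reviewed in the paper; the temperature, mixing weight, and toy dimensions are illustrative assumptions only.

# Minimal sketch of the classic soft-label knowledge distillation loss.
# Hyperparameters (temperature, alpha) and the toy shapes are placeholders,
# not values taken from the surveyed FinLLM methods.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend the soft-target KL term with the usual cross-entropy term."""
    # Soften both distributions with the same temperature.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between teacher and student soft distributions;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Usage with random logits standing in for teacher/student outputs
# over a small vocabulary (batch of 4, vocabulary of 10).
if __name__ == "__main__":
    teacher_logits = torch.randn(4, 10)
    student_logits = torch.randn(4, 10, requires_grad=True)
    labels = torch.randint(0, 10, (4,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(float(loss))

In practice, the mixing weight alpha trades off imitation of the teacher's output distribution against direct supervision; the surveyed strategies differ mainly in what signal is distilled (logits, intermediate representations, or generated rationales) rather than in this basic formulation.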

