On-line Access: 2025-10-20
Received: 2025-04-30
Revision Accepted: 2025-09-05
Jiaqi SHI, Xulong ZHANG, Xiaoyang QU, Junfei XIE, Jianzong WANG. Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation[J]. Frontiers of Information Technology & Electronic Engineering, 2025 (in press). https://doi.org/10.1631/FITEE.2500282
@article{shi2025knowledge,
title="Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation",
author="Jiaqi SHI and Xulong ZHANG and Xiaoyang QU and Junfei XIE and Jianzong WANG",
journal="Frontiers of Information Technology \& Electronic Engineering",
year="2025",
note="in press",
publisher="Zhejiang University Press \& Springer",
doi="10.1631/FITEE.2500282"
}
Abstract: Financial large language models (FinLLMs) offer immense potential for financial applications, but excessive deployment expenditures and considerable inference latency remain major obstacles. As a prominent model compression methodology, knowledge distillation (KD) offers an effective solution to these difficulties. This work presents a comprehensive survey of how KD interacts with FinLLMs, covering three core aspects: strategy, application, and evaluation. At the strategy level, this review introduces a structured taxonomy to comparatively analyze existing distillation pathways. At the application level, it puts forward a logical "upstream-midstream-downstream" framework to systematically explain the practical value of distilled models in the financial field. At the evaluation level, to address the absence of standards in the financial domain, it constructs a comprehensive evaluation framework spanning multiple dimensions such as financial accuracy, reasoning fidelity, and robustness. In summary, this research aims to provide a clear roadmap for this interdisciplinary field and to accelerate the development of distilled FinLLMs.
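The abstract names knowledge distillation as the compression technique under review but does not spell out its objective. As a rough, non-authoritative illustration only (not the paper's method), the sketch below shows a generic response-based KD loss: a temperature-softened match to the teacher's output distribution blended with the usual hard-label cross-entropy. The function name kd_loss and the hyperparameters T and alpha are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic response-based distillation loss (illustrative sketch only)."""
    # Soft-target term: F.kl_div expects the student's log-probabilities as input
    # and the teacher's probabilities as target; both are softened by temperature T.
    # The T^2 factor is a common convention to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random logits over a 32-token vocabulary.
if __name__ == "__main__":
    student = torch.randn(4, 32)
    teacher = torch.randn(4, 32)
    labels = torch.randint(0, 32, (4,))
    print(kd_loss(student, teacher, labels).item())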