On-line Access: 2025-03-07
Received: 2024-04-16
Revision Accepted: 2024-12-03
Crosschecked: 2025-03-07
Siyao SONG, Guoao SUN, Yifan CHANG, Nengwen ZHAO, Yijun YU. TSNet: a foundation model for wireless network status prediction in digital twins[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2400295
Affiliation: Huawei Technologies Co., Ltd., Shenzhen 518129, China

Abstract: Predicting the future status of a network is a key capability of digital twin networks; it helps operations staff anticipate changes in network performance and take action in advance. Existing forecasting approaches, including statistical, machine learning, and deep learning methods, are limited in their generalization ability and their dependence on training data. To address these problems, inspired by the pretraining and fine-tuning framework from natural language processing and computer vision, we propose TSNet, a Transformer-based foundation model for forecasting diverse network performance indicators. To better model time series with the Transformer architecture, we introduce a frequency-domain attention mechanism and time-series decomposition. In addition, we design a lightweight fine-tuning strategy that allows TSNet to generalize quickly to new data and scenarios. Experimental results show that zero-shot TSNet forecasts, produced without any training data, outperform supervised baselines. With few-shot fine-tuning, forecasting accuracy improves further. Overall, TSNet achieves high accuracy and strong generalization across diverse datasets.
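The abstract does not give implementation details, but time-series decomposition and frequency-domain attention are well-established building blocks for Transformer forecasters. Below is a minimal PyTorch sketch of how such components are commonly assembled; the module names, moving-average kernel size, and number of retained Fourier modes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SeriesDecomposition(nn.Module):
    """Split a series into a trend (moving average) and a seasonal residual."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1,
                                padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x):                        # x: (batch, length, channels)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)
        return x - trend, trend                  # (seasonal, trend)

class FrequencyAttention(nn.Module):
    """Mix information across a few retained Fourier modes rather than all time steps."""
    def __init__(self, d_model: int, n_modes: int = 16):
        super().__init__()
        self.n_modes = n_modes
        # Learnable complex-valued mixing weights for the retained modes.
        self.weight = nn.Parameter(
            torch.randn(n_modes, d_model, dtype=torch.cfloat) * 0.02)

    def forward(self, x):                        # x: (batch, length, d_model)
        length = x.size(1)
        spec = torch.fft.rfft(x, dim=1)          # (batch, length//2 + 1, d_model)
        k = min(self.n_modes, spec.size(1))
        out = torch.zeros_like(spec)
        out[:, :k] = spec[:, :k] * self.weight[:k]   # keep low-frequency modes only
        return torch.fft.irfft(out, n=length, dim=1)

# Usage on a dummy batch of KPI windows: 8 series, 96 steps, 64 features.
x = torch.randn(8, 96, 64)
seasonal, trend = SeriesDecomposition()(x)
y = FrequencyAttention(d_model=64)(seasonal) + trend
print(y.shape)   # torch.Size([8, 96, 64])
```

Operating on a truncated spectrum keeps the attention cost independent of the window length and biases the model toward the periodic structure that dominates network KPIs.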
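The lightweight fine-tuning strategy is likewise not specified in the abstract. One common parameter-efficient recipe, in the spirit of the residual adapters of Houlsby et al. (2019), freezes the pretrained backbone and trains only small bottleneck layers. The sketch below shows that pattern under this assumption; the stand-in linear backbone, shapes, and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, ReLU, up-project, residual connection."""
    def __init__(self, d_model: int, bottleneck: int = 8):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        nn.init.zeros_(self.up.weight)   # adapters start as the identity map,
        nn.init.zeros_(self.up.bias)     # so tuning begins from the pretrained model

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

class AdaptedBlock(nn.Module):
    """Frozen pretrained block followed by a small trainable adapter."""
    def __init__(self, block: nn.Module, d_model: int):
        super().__init__()
        for p in block.parameters():
            p.requires_grad = False      # backbone weights stay fixed
        self.block = block
        self.adapter = Adapter(d_model)

    def forward(self, x):
        return self.adapter(self.block(x))

# Few-shot fine-tuning on a tiny target-domain dataset (stand-in backbone).
d_model = 64
backbone = nn.Sequential(*[AdaptedBlock(nn.Linear(d_model, d_model), d_model)
                           for _ in range(4)])
head = nn.Linear(d_model, 1)             # forecasting head, also trainable
params = ([p for p in backbone.parameters() if p.requires_grad]
          + list(head.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
x, y = torch.randn(32, d_model), torch.randn(32, 1)   # few labeled samples
for _ in range(10):
    opt.zero_grad()
    loss = nn.functional.mse_loss(head(backbone(x)), y)
    loss.backward()
    opt.step()
```

Because only the adapters and the forecasting head receive gradient updates, a handful of target-domain samples suffices and the pretrained weights are never overwritten, which matches the fast generalization to new data and scenarios that the abstract claims.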