On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2024-01-03
ORCID: https://orcid.org/0000-0002-2454-4219
Mingyuan BAI, Derun ZHOU, Qibin ZHAO. TendiffPure: a convolutional tensor-train denoising diffusion model for purification[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2300392
TendiffPure: a convolutional tensor-train denoising diffusion model for purification

1 RIKEN Center for Advanced Intelligence Project, Tokyo 103-0027, Japan
2 School of Environment and Society, Tokyo Institute of Technology, Tokyo 152-8550, Japan

Abstract: Diffusion models are effective purification methods that use a generative process to remove noise or adversarial attacks from images before an off-the-shelf classifier performs its classification task. However, the efficiency of diffusion models remains a concern, and existing solutions based on knowledge distillation can compromise generation quality because they use fewer generation steps. We therefore propose TendiffPure, a tensorized and compressed diffusion model for purification. Unlike knowledge-distillation methods, we directly compress the U-Net backbone of the diffusion model with tensor-train decomposition, which reduces the number of parameters and captures more spatial information in multi-dimensional data such as images. The space complexity is reduced from O(N²) to O(NR²), where R ≤ 4 is the tensor-train rank and N is the number of channels. Experimental results on the CIFAR-10, Fashion-MNIST, and MNIST datasets show that TendiffPure generates high-quality purification results more efficiently and outperforms the baseline purification methods under two types of noise and one adversarial attack.
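As a rough illustration of the compression described in the abstract, the sketch below factorizes a dense N×N channel-mixing weight (the dominant parameter cost of a convolution layer) into two tensor-train cores of rank R. All names and shapes here (`n1`, `n2`, `rank`, `tt_matvec`) are illustrative assumptions, not the paper's implementation; the point is only the O(N²) → O(NR²)-style parameter saving.

```python
import numpy as np

def tt_matvec(core1, core2, x, n1, n2):
    """Apply W @ x where W is stored as two TT cores.

    core1: (n1, n1, rank), core2: (rank, n2, n2)
    W[(i1,i2),(j1,j2)] = sum_r core1[i1, j1, r] * core2[r, i2, j2]
    """
    x = x.reshape(n1, n2)  # split the flat channel index into (j1, j2)
    y = np.einsum('ijr,rkl,jl->ik', core1, core2, x)
    return y.reshape(n1 * n2)

rng = np.random.default_rng(0)
n1, n2, rank = 8, 8, 4                # N = 64 channels, TT rank R = 4
core1 = rng.standard_normal((n1, n1, rank))
core2 = rng.standard_normal((rank, n2, n2))

# Parameter counts: dense N^2 vs. the two small TT cores.
dense_params = (n1 * n2) ** 2         # 64^2 = 4096
tt_params = core1.size + core2.size   # 8*8*4 + 4*8*8 = 512

# Sanity check: the TT cores define a genuine dense matrix,
# and applying the cores directly matches the dense mat-vec.
W = np.einsum('ijr,rkl->ikjl', core1, core2).reshape(n1 * n2, n1 * n2)
x = rng.standard_normal(n1 * n2)
assert np.allclose(W @ x, tt_matvec(core1, core2, x, n1, n2))
```

In this toy setting the factorized layer stores 512 numbers instead of 4096; in an actual U-Net the same idea would be applied to each convolution's channel dimensions, trading expressiveness (controlled by the rank) for memory and compute.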