
CLC number: TP391.7

On-line Access: 2026-03-23

Received: 2025-11-28

Revision Accepted: 2026-02-03

Crosschecked: 2026-03-23


ORCID:

Lingyun SUN: https://orcid.org/0000-0002-5561-0493

Hanfei ZHU: https://orcid.org/0000-0001-6953-5212

Wei XIANG: https://orcid.org/0000-0003-2058-5379

Yifu ZHANG: https://orcid.org/0000-0003-0646-6322

Ziyue LEI: https://orcid.org/0000-0001-9568-6823


Frontiers of Information Technology & Electronic Engineering  2026 Vol.27 No.3 P.1-17

http://doi.org/10.1631/ENG.ITEE.2025.0159


Leveraging peripheral interactions to improve drivers’ situation awareness and NDRT efficiency


Author(s):  Hanfei ZHU, Wei XIANG, Yifu ZHANG, Ziyue LEI, Lingyun SUN

Affiliation(s):  1. International Design Institute, Zhejiang University, Hangzhou 310058, China

Corresponding email(s):   sunly@zju.edu.cn

Key Words:  Automated driving, Situation awareness, Non-driving-related tasks, Peripheral interaction, Surround sound, Airflow


Hanfei ZHU, Wei XIANG, Yifu ZHANG, Ziyue LEI, Lingyun SUN. Leveraging peripheral interactions to improve drivers’ situation awareness and NDRT efficiency[J]. Frontiers of Information Technology & Electronic Engineering, 2026, 27(3): 1-17.

@article{title="Leveraging peripheral interactions to improve drivers’ situation awareness and NDRT efficiency",
author="Hanfei ZHU, Wei XIANG, Yifu ZHANG, Ziyue LEI, Lingyun SUN",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="27",
number="3",
pages="1-17",
year="2026",
publisher="Zhejiang University Press & Springer",
doi="10.1631/ENG.ITEE.2025.0159"
}

%0 Journal Article
%T Leveraging peripheral interactions to improve drivers’ situation awareness and NDRT efficiency
%A Hanfei ZHU
%A Wei XIANG
%A Yifu ZHANG
%A Ziyue LEI
%A Lingyun SUN
%J Frontiers of Information Technology & Electronic Engineering
%V 27
%N 3
%P 1-17
%@ 1869-1951
%D 2026
%I Zhejiang University Press & Springer
%DOI 10.1631/ENG.ITEE.2025.0159

TY - JOUR
T1 - Leveraging peripheral interactions to improve drivers’ situation awareness and NDRT efficiency
A1 - Hanfei ZHU
A1 - Wei XIANG
A1 - Yifu ZHANG
A1 - Ziyue LEI
A1 - Lingyun SUN
JO - Frontiers of Information Technology & Electronic Engineering
VL - 27
IS - 3
SP - 1
EP - 17
SN - 1869-1951
Y1 - 2026
PB - Zhejiang University Press & Springer
DO - 10.1631/ENG.ITEE.2025.0159
ER -


Abstract: 
L3 automated driving has introduced a trend of drivers engaging in non-driving-related tasks (NDRTs), but it also poses safety challenges for reconstructing drivers’ situation awareness (SA). Two consecutive empirical studies were conducted in a driving simulator to investigate the effects of two peripheral interactions (airflow conveying the vehicle’s intended behaviors and surround sound conveying information about road users) on drivers’ SA performance, NDRT efficiency, workload, and user experience. The first study (n=21) explored the differential effects of airflow, surround sound, and their integration. The second study (n=30) investigated how the integrated interaction performed across different NDRT difficulties. Results demonstrated that airflow and surround sound each significantly improved drivers’ SA when used individually, with distinct advantages, and that integrating the two yielded the best results. Notably, the integrated interaction was more effective at improving SA during the hard NDRT than during the easy one. Furthermore, drivers reported reduced subjective workload and enhanced user experience when using these peripheral interaction methods. Our work offers insights for designing in-vehicle interaction systems that not only reconstruct drivers’ SA but also support NDRT participation, ensuring both safety and productivity.

Leveraging peripheral interactions to improve drivers’ situation awareness and non-driving-related task efficiency

Hanfei ZHU, Wei XIANG, Yifu ZHANG, Ziyue LEI, Lingyun SUN
International Design Institute, Zhejiang University, Hangzhou 310058, China
Abstract: L3 automated driving has made it a trend for drivers to engage in non-driving-related tasks (NDRTs) while traveling, but it also poses safety challenges for reconstructing drivers’ situation awareness (SA). We therefore conducted two consecutive empirical studies in a driving simulator to examine the effects of two peripheral interactions (airflow conveying the vehicle’s intended behaviors and surround sound conveying information about road users) on drivers’ SA performance, NDRT efficiency, workload, and user experience. The first study (n=21) explored the differential effects of airflow, surround sound, and their integration; the second study (n=30) further examined how the integrated interaction performed under different NDRT difficulties. Results show that airflow and surround sound each significantly improved drivers’ SA when used individually, with distinct advantages, and that their integration achieved the best results. Notably, the integrated interaction improved SA more under the hard NDRT than under the easy one. In addition, drivers reported lower subjective workload and better user experience when using these peripheral interaction methods. This work offers insights for designing in-vehicle interaction systems that support drivers’ NDRT participation and efficiency while effectively reconstructing their SA, balancing safety and productivity.

Key words: Automated driving; Situation awareness; Non-driving-related tasks; Peripheral interaction; Surround sound; Airflow


Reference

[1]Bakker S, 2013. Design for Peripheral Interaction. PhD Dissertation, Technische Universiteit Eindhoven, Eindhoven, The Netherlands.

[2]Bakker S, van den Hoven E, Eggen B, 2015. Peripheral interaction: characteristics and considerations. Pers Ubiq Comput, 19(1):239-254.

[3]Beattie D, Baillie L, Halvey M, 2015. A comparison of artificial driving sounds for automated vehicles. Proc ACM Int Joint Conf on Pervasive and Ubiquitous Computing, p.451-462.

[4]Borojeni SS, Wallbaum T, Heuten W, et al., 2017. Comparing shape-changing and vibro-tactile steering wheels for take-over requests in highly automated driving. Proc 9th Int Conf on Automotive User Interfaces and Interactive Vehicular Applications, p.221-225.

[5]Cai Y, Jin SY, Chen ZH, et al., 2025. Measuring human perception of airflow for natural motion simulation in virtual reality. IEEE Trans Vis Comput Graph, 31(5):2943-2953.

[6]Chai CL, Lei Y, Wei HR, et al., 2024. The effects of various auditory takeover requests: a simulated driving study considering the modality of non-driving-related tasks. Appl Ergon, 118:104252.

[7]Chen HL, Zhao XH, Li ZL, et al., 2024. Construction and analysis of driver takeover behavior modes based on situation awareness theory. IEEE Trans Intell Veh, 9(2):4040-4054.

[8]Cheng SY, Dong HM, Yue YF, et al., 2021. Automated driving: acceptance and chances for young people. 13th Int Conf on Cross-Cultural Design, p.182-194.

[9]Clark H, McLaughlin AC, Feng J, 2017. Situational awareness and time to takeover: exploring an alternative method to measure engagement with high-level automation. Proc Hum Factors Ergon Soc Annu Meet, 61(1):1452-1456.

[10]Cohen-Lazry G, Katzman N, Borowsky A, et al., 2019. Directional tactile alerts for take-over requests in highly-automated driving. Transp Res Part F Traff Psychol Behav, 65:217-226.

[11]Colley M, Eder B, Rixen JO, et al., 2021a. Effects of semantic segmentation visualization on trust, situation awareness, and cognitive load in highly automated vehicles. Proc CHI Conf on Human Factors in Computing Systems, Article 155.

[12]Colley M, Gruler L, Woide M, et al., 2021b. Investigating the design of information presentation in take-over requests in automated vehicles. Proc 23rd Int Conf on Mobile Human-Computer Interaction, Article 22.

[13]Colley M, Speidel O, Strohbeck J, et al., 2024. Effects of uncertain trajectory prediction visualization in highly automated vehicles on trust, situation awareness, and cognitive load. Proc ACM Interact Mob Wear Ubiq Technol, 7(4):153.

[14]DeGuzman CA, Donmez B, 2024. Training benefits driver behaviour while using automation with an attention monitoring system. Transp Res Part C Emerg Technol, 165:104752.

[15]Deligiannidis L, Jacob RJK, 2006. The VR scooter: wind and tactile feedback improve user performance. 3D User Interfaces, p.143-150.

[16]Detjen H, Salini M, Kronenberger J, et al., 2021. Towards transparent behavior of automated vehicles: design and evaluation of HUD concepts to support system predictability through motion intent communication. Proc 23rd Int Conf on Mobile Human-Computer Interaction, Article 19.

[17]Ding YH, Jia LS, Du N, 2024. One size does not fit all: designing and evaluating criticality-adaptive displays in highly automated vehicles. Proc CHI Conf on Human Factors in Computing Systems, Article 92.

[18]Du N, Kim J, Zhou F, et al., 2020. Evaluating effects of cognitive load, takeover request lead time, and traffic density on drivers’ takeover performance in conditionally automated driving. 12th Int Conf on Automotive User Interfaces and Interactive Vehicular Applications, p.66-73.

[19]Durso FT, Bleckley MK, Dattel AR, 2006. Does situation awareness add to the validity of cognitive tests? Hum Factors, 48(4):721-733.

[20]Edworthy J, Loxley S, Dennis I, 1991. Improving auditory warning design: relationship between warning sound parameters and perceived urgency. Hum Factors, 33(2):205-231.

[21]Endsley MR, 1988. Situation awareness global assessment technique (SAGAT). Proc IEEE National Aerospace and Electronics Conf, p.789-795.

[22]Endsley MR, 1995. Toward a theory of situation awareness in dynamic systems. Hum Factors, 37(1):32-64.

[23]Endsley MR, 2001. Designing for situation awareness in complex system. Proc 2nd Int Workshop on Symbiosis of Humans, Artifacts and Environment, p.1-14.

[24]Endsley MR, 2020. Situation awareness in driving. In: Fisher DL, Horrey WJ, Lee JD, et al. (Eds.), Handbook of Human Factors for Automated, Connected, and Intelligent Vehicles. CRC Press, Boca Raton, USA.

[25]Endsley MR, 2021. A systematic review and meta-analysis of direct objective measures of situation awareness: a comparison of SAGAT and SPAM. Hum Factors, 63(1):124-150.

[26]Epple S, Roche F, Brandenburg S, 2018. The sooner the better: drivers’ reactions to two-step take-over requests in highly automated driving. Proc Hum Factors Ergon Soc Annu Meet, 62(1):1883-1887.

[27]Fagerlönn J, Lindberg S, Sirkka A, 2015. Combined auditory warnings for driving-related information. Proc Audio Mostly on Interaction with Sound, Article 11.

[28]Greenlee MW, Frank SM, Kaliuzhna M, et al., 2016. Multisensory integration in self motion perception. Multisens Res, 29(6-7):525-556.

[29]Hart SG, Staveland LE, 1988. Development of NASA-TLX (task load index): results of empirical and theoretical research. Adv Psychol, 52:139-183.

[30]Hasegawa K, Wu YB, Kihara K, 2024. Two-stage transition procedure reduces potential hazards on planned transitions in automated driving. Transp Res Part F Traff Psychol Behav, 107:924-936.

[31]Hermann T, Hunt A, Neuhoff JG, 2011. The Sonification Handbook. Logos Publishing House, Berlin, Germany.

[32]Hungund AP, Pradhan AK, 2023. Impact of non-driving related tasks while operating automated driving systems (ADS): a systematic review. Accid Anal Prev, 188:107076.

[33]International Organization for Standardization (ISO), 2018. Road Vehicles Functional Safety. ISO 26262:2018. Geneva, Switzerland. https://www.iso.org/standard/68383.html

[34]Isherwood SJ, McKeown D, 2017. Semantic congruency of auditory warnings. Ergonomics, 60(7):1014-1023.

[35]Karatas N, Tanaka T, Fujikakc K, et al., 2020. Evaluation of AR-HUD interface during an automated intervention in manual driving. IEEE Intelligent Vehicles Symp, p.2158-2164.

[36]Kern D, Marshall P, Hornecker E, et al., 2009. Enhancing navigation information with tactile output embedded into the steering wheel. 7th Int Conf on Pervasive Computing, p.42-58.

[37]Kim S, van Egmond R, Happee R, 2024a. How manoeuvre information via auditory (spatial and beep) and visual UI can enhance trust and acceptance in automated driving. Transp Res Part F Traff Psychol Behav, 100:22-36.

[38]Kim S, He XL, van Egmond R, et al., 2024b. Designing user interfaces for partially automated vehicles: effects of information and modality on trust and acceptance. Transp Res Part F Traff Psychol Behav, 103:404-419.

[39]Koo J, Kwac J, Ju W, et al., 2015. Why did my car just do that? Explaining semi-autonomous driving actions to improve driver understanding, trust, and performance. Int J Interact Des Manuf, 9(4):269-275.

[40]Kooijman L, Asadi H, Mohamed S, et al., 2022. A systematic review and meta-analysis on the use of tactile stimulation in vection research. Atten Percept Psychophys, 84(1):300-320.

[41]Kooijman L, Asadi H, Arango CG, et al., 2024. Investigating the influence of neck muscle vibration on illusory self-motion in virtual reality. Virtual Real, 28(2):76.

[42]Kurosawa M, Ikei Y, Suzuki Y, et al., 2018. Airflow for body motion virtual reality. 20th Int Conf on Human Interface and the Management of Information, p.395-402.

[43]Laugwitz B, Held T, Schrepp M, 2008. Construction and evaluation of a user experience questionnaire. In: Holzinger A (Ed.), HCI and Usability for Education and Work. USAB 2008. Lecture Notes in Computer Science, Vol. 5298. Springer, Berlin, Heidelberg.

[44]Lerner N, Singer J, Kellman D, et al., 2015. In-vehicle noise alters the perceived meaning of auditory signals. Proc 8th Int Driving Symp on Human Factors in Driver Assessment, Training and Vehicle Design, p.401-407.

[45]Li MY, Holthausen BE, Stuck RE, et al., 2019. No risk no trust: investigating perceived risk in highly automated driving. Proc 11th Int Conf on Automotive User Interfaces and Interactive Vehicular Applications, p.177-185.

[46]Liu WM, Li QK, Wang ZY, et al., 2023. A literature review on additional semantic information conveyed from driving automation systems to drivers through advanced in-vehicle HMI just before, during, and right after takeover request. Int J Hum-Comput Int, 39(10):1995-2015.

[47]Löcken A, Frison AK, Fahn V, et al., 2020. Increasing user experience and trust in automated vehicles via an ambient light display. 22nd Int Conf on Human-Computer Interaction with Mobile Devices and Services, Article 38.

[48]Loft S, Bowden V, Braithwaite J, et al., 2015. Situation awareness measures for simulated submarine track management. Hum Factors, 57(2):298-310.

[49]Lu ZJ, Coster X, de Winter J, 2017. How much time do drivers need to obtain situation awareness? A laboratory-based study of automated driving. Appl Ergon, 60:293-304.

[50]Ma S, Zhang W, Yang Z, et al., 2021. Take over gradually in conditional automated driving: the effect of two-stage warning systems on situation awareness, driving stress, takeover performance, and acceptance. Int J Hum-Comput Int, 37(4):352-362.

[51]McNeer RR, Bohórquez J, Özdamar Ö, et al., 2007. A new paradigm for the design of audible alarms that convey urgency information. J Clin Monit Comput, 21(6):353-363.

[52]Meinhardt LM, Rück M, Zähnle J, et al., 2024. Hey, what’s going on? Conveying traffic information to people with visual impairments in highly automated vehicles: introducing onboard. Proc ACM Interact Mob Wear Ubiq Technol, 8(2):67.

[53]Müller J, Geier M, Dicke C, et al., 2014. The BoomRoom: mid-air direct interaction with virtual sound sources. Proc SIGCHI Conf on Human Factors in Computing Systems, p.247-256.

[54]Nahin Ch NA, Fortier J, Janssen CP, et al., 2024. Text a bit longer or drive now? Resuming driving after texting in conditionally automated cars. Proc 16th Int Conf on Automotive User Interfaces and Interactive Vehicular Applications, p.13-22.

[55]National Transportation Safety Board, 2018. Collision Between a Sport Utility Vehicle Operating with Partial Driving Automation and a Crash Attenuator Mountain View, California. National Transportation Safety Board, Washington, D.C., USA.

[56]Naujoks F, Neukum A, 2014. Specificity and timing of advisory warnings based on cooperative perception. Mensch & Computer 2014-Workshopband, p.229-238.

[57]Nordhoff S, de Winter J, Kyriakidis M, et al., 2018. Acceptance of driverless vehicles: results from a large cross-national questionnaire study. J Adv Transp, 2018:5382192.

[58]Park S, Xing YL, Akash K, et al., 2022. The impact of environmental complexity on drivers’ situation awareness. Proc 14th Int Conf on Automotive User Interfaces and Interactive Vehicular Applications, p.131-138.

[59]Pfleging B, Rang M, Broy N, 2016. Investigating user needs for non-driving-related activities during automated driving. Proc 15th Int Conf on Mobile and Ubiquitous Multimedia, p.91-99.

[60]Poletti MA, Betlehem T, Abhayapala T, 2012. Higher order loudspeakers for improved surround sound reproduction in rooms. 133rd Audio Engineering Society Convention, p.513-525.

[61]Riecke BE, Murovec B, Campos JL, et al., 2023. Beyond the eye: multisensory contributions to the sensation of illusory self-motion (Vection). Multisens Res, 36(8):827-864.

[62]Rietzler M, Plaumann K, Kränzle T, et al., 2017. VaiR: simulating 3D airflows in virtual reality. Proc CHI Conf on Human Factors in Computing Systems, p.5669-5677.

[63]Šabić E, Chen J, MacDonald JA, 2021. Toward a better understanding of in-vehicle auditory warnings and background noise. Hum Factors, 63(2):312-335.

[64]SAE, 2021. Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-road Motor Vehicles. SAE International, Warrendale, USA.

[65]Samuel S, Borowsky A, Zilberstein S, et al., 2016. Minimum time to situation awareness in scenarios involving transfer of control from an automated driving suite. Transp Res Rec, 2602(1):115-120.

[66]Schartmüller C, Weigl K, Löcken A, et al., 2021. Displays for productive non-driving related tasks: visual behavior and its impact in conditionally automated driving. Multim Technol Interact, 5(4):21.

[67]Schoop E, Smith J, Hartmann B, 2018. HindSight: enhancing spatial awareness by sonifying detected objects in real-time 360-degree video. Proc CHI Conf on Human Factors in Computing Systems, Article 143.

[68]Seno T, Ogawa M, Ito H, et al., 2011. Consistent air flow to the face facilitates vection. Perception, 40(10):1237-1240.

[69]Shi E, Bengler K, 2022. Non-driving related tasks’ effects on takeover and manual driving behavior in a real driving setting: a differentiation approach based on task switching and modality shifting. Accid Anal Prev, 178:106844.

[70]Shi JL, Zhang W, Wei HR, et al., 2024. Investigating looming tactile takeover requests with various levels of urgency in automated vehicles. Accid Anal Prev, 208:107790.

[71]Song JQ, Wang YW, An XJ, et al., 2022. Novel sonification designs: compressed, iconic, and pitch-dynamic auditory icons boost driving behavior. Appl Ergon, 103:103797.

[72]Stapel J, Mullakkal-Babu FA, Happee R, 2019. Automated driving reduces perceived workload, but monitoring causes higher cognitive load than manual driving. Transp Res Part F Traff Psychol Behav, 60:590-605.

[73]Steinbach R, Tefft BC, 2023. American Driving Survey: 2022. Research Brief. AAA Foundation for Traffic Safety, Washington, D.C., USA.

[74]Tefft BC, 2022. American Driving Survey, 2020–2021. Research Brief. AAA Foundation for Traffic Safety, Washington, D.C., USA.

[75]Tseng CM, Chen PY, Lin SC, et al., 2022. Headwind: enhancing teleportation experience in VR by simulating air drag during rapid motion. Proc CHI Conf on Human Factors in Computing Systems, Article 518.

[76]Vogel K, 2003. A comparison of headway and time to collision as safety indicators. Accid Anal Prev, 35(3):427-433.

[77]Wandtner B, Schömig N, Schmidt G, 2018. Effects of non-driving related task modalities on takeover performance in highly automated driving. Hum Factors, 60(6):870-881.

[78]Wang JM, Yang JY, Fu QW, et al., 2024. A new dynamic spatial information design framework for AR-HUD to evoke drivers’ instinctive responses and improve accident prevention. Int J Hum-Comput Stud, 183:103194.

[79]Wang MJ, Lyckvi SL, Chen F, 2016. Why and how traffic safety cultures matter when designing advisory traffic information systems. Proc CHI Conf on Human Factors in Computing Systems, p.2808-2818.

[80]Wang MJ, Lyckvi SL, Chen CH, et al., 2017. Using advisory 3D sound cues to improve drivers’ performance and situation awareness. Proc CHI Conf on Human Factors in Computing Systems, p.2814-2825.

[81]Wang MJ, Liao Y, Lyckvi SL, et al., 2020. How drivers respond to visual vs. auditory information in advisory traffic information systems. Behav Inform Technol, 39(12):1308-1319.

[82]Wintersberger P, Riener A, Schartmüller C, et al., 2018. Let me finish before I take over: towards attention aware device integration in highly automated vehicles. Proc 10th Int Conf on Automotive User Interfaces and Interactive Vehicular Applications, p.53-65.

[83]Wörle J, Metz B, 2020. Driving with an L3-motorway chauffeur: how do drivers use their driving time? Proc Human Factors and Ergonomics Society Europe, p.53-62.

[84]Xing HN, Qin H, Niu JW, 2017. Driver’s information needs in automated driving. 9th Int Conf on Cross-Cultural Design, p.736-744.

[85]Xu CL, Li PH, Li YB, et al., 2022. Takeover performance and workload under varying automation levels, time budget and road curvature. IEEE Asia-Pacific Conf on Image Processing, Electronics and Computers, p.1379-1385.

[86]Xu LL, Guo L, Ge PS, et al., 2022. Effect of multiple monitoring requests on vigilance and readiness by measuring eye movement and takeover performance. Transp Res Part F Traff Psychol Behav, 91:179-190.

[87]Yang J, Barde A, Billinghurst M, 2022. Audio augmented reality: a systematic review of technologies, applications, and future research directions. J Audio Eng Soc, 70(10):788-809.

[88]Yang YC, Karakaya B, Dominioni GC, et al., 2018. An HMI concept to improve driver’s visual behavior and situation awareness in automated vehicle. 21st Int Conf on Intelligent Transportation Systems, p.650-655.

[89]Yang Z, Shi JL, Zhang Y, et al., 2019. Head-up display graphic warning system facilitates simulated driving performance. Int J Hum-Comput Int, 35(9):796-803.

[90]Yeo D, Kim G, Oh M, et al., 2025. AttraCar: multisensory in-car VR with thermal, airflow, and motion feedback through built-in vehicle systems. Proc 38th Annual ACM Symp on User Interface Software and Technology, Article 163.

[91]Zangi N, Srour-Zreik R, Ridel D, et al., 2022. Driver distraction and its effects on partially automated driving performance: a driving simulator study among young-experienced drivers. Accid Anal Prev, 166:106565.

[92]Zeeb K, Buchner A, Schrauf M, 2015. What determines the take-over time? An integrated model approach of driver take-over after automated driving. Accid Anal Prev, 78:212-221.

[93]Zhang N, Fard M, Xu J, et al., 2023. Influence of non-driving related tasks on driving performance after takeover transition in conditionally automated driving. Transp Res Part F Traff Psychol Behav, 96:248-264.

[94]Zhang TR, Li WT, Huang WX, et al., 2024. Critical roles of explainability in shaping perception, trust, and acceptance of autonomous vehicles. Int J Ind Ergonom, 100:103568.

[95]Zhang YW, Wang WJ, Zhou XY, et al., 2023. Tactical-level explanation is not enough: effect of explaining AV’s lane-changing decisions on drivers’ decision-making, trust, and emotional experience. Int J Hum-Comput Int, 39(7):1438-1454.

[96]Zhao YH, Bennett CL, Benko H, et al., 2018. Enabling people with visual impairments to navigate virtual reality with a haptic and auditory cane simulation. Proc CHI Conf on Human Factors in Computing Systems, Article 116.


Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - 2026 Journal of Zhejiang University-SCIENCE