Tu et al. Soft Sci 2023;3:25
DOI: 10.20517/ss.2023.15
Perspective | Open Access
Electronic skins with multimodal sensing and
perception
Jiaqi Tu^1,2, Ming Wang^3, Wenlong Li^1,2, Jiangtao Su^2, Yanzhen Li^2, Zhisheng Lv^2, Haicheng Li^2,5, Xue Feng^6,7,*, Xiaodong Chen^2,3,4,*
1. Institute of Flexible Electronics Technology of THU, Jiaxing 314000, Zhejiang, China.
2. Innovative Center for Flexible Devices (iFLEX), Max Planck - NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, Singapore 639798, Singapore.
3. Institute of Materials Research and Engineering, Agency for Science, Technology and Research (A*STAR), Singapore 138634, Singapore.
4. Institute for Digital Molecular Analytics and Science (IDMxS), Nanyang Technological University, Singapore 639798, Singapore.
5. Frontier Institute of Chip and System, Fudan University, Shanghai 200433, China.
6. AML, Department of Engineering Mechanics, Tsinghua University, Beijing 100084, China.
7. Laboratory of Flexible Electronics Technology, Tsinghua University, Beijing 100084, China.
* Correspondence to: Prof. Xiaodong Chen, Innovative Center for Flexible Devices (iFLEX), Max Planck - NTU Joint Lab for
Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue,
Singapore 639798, Singapore. E-mail: chenxd@ntu.edu.sg; Prof. Xue Feng, Laboratory of Flexible Electronics Technology,
Tsinghua University, Beijing 100084, China. E-mail: fengxue@tsinghua.edu.cn
How to cite this article: Tu J, Wang M, Li W, Su J, Li Y, Lv Z, Li H, Feng X, Chen X. Electronic skins with multimodal sensing and
perception. Soft Sci 2023;3:25. https://dx.doi.org/10.20517/ss.2023.15
Received: 24 Mar 2023 First Decision: 25 Apr 2023 Revised: 7 Jun 2023 Accepted: 13 Jun 2023 Published: 11 Jul 2023
Academic Editors: Dae-Hyeong Kim, Zhifeng Ren Copy Editor: Pei-Yun Wang Production Editor: Pei-Yun Wang
Abstract
In biological cognition, multiple types of sensory information are detected and integrated to improve perceptual accuracy and sensitivity. However, current studies on electronic skin (e-skin) systems have mainly focused on optimizing modality-specific data acquisition and processing. Endowing e-skins with multimodal sensing, and even perception capabilities that enable high-level perceptual behaviors, remains insufficiently explored. Moreover, progress toward perception in multisensory e-skin systems faces challenges at both the device and software levels. Here, we provide a perspective on multisensory fusion in e-skins. Recent progress in e-skins that realize multimodal sensing is reviewed, followed by bottom-up and top-down approaches to multimodal perception. With a deepening understanding of neuroscience and rapid advances in novel algorithms and devices, multimodal perception becomes achievable and will promote the development of highly intelligent e-skin systems.
Keywords: Electronic skins, multimodal sensing, perception fusion
© The Author(s) 2023. Open Access This article is licensed under a Creative Commons Attribution 4.0
International License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, sharing,
adaptation, distribution and reproduction in any medium or format, for any purpose, even commercially, as
long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and
indicate if changes were made.

