
Wei et al. Soft Sci 2023;3:17  https://dx.doi.org/10.20517/ss.2023.09           Page 23 of 38

conversation medium. Therefore, electronic textiles used for the recognition of hand gestures, sign-to-speech translation, and human-machine gesture collaboration have been broadly explored in recent years[88,188-192]. Liu et al. developed fiber-type sensors made of an Ecoflex/CNT composite and integrated them with gloves through sewing and printing to make a smart glove[193]. With the help of deep learning and control systems, the smart glove can precisely identify gestures with a high accuracy of 98.4%, as shown in Figure 11A.
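To make the idea of glove-based gesture classification concrete, the sketch below classifies a multi-channel fiber-sensor reading against stored gesture templates with a simple nearest-centroid rule. This is a deliberately simplified stand-in for the deep-learning models used in the cited work; the gesture names, five-finger channel layout, and readings are all illustrative assumptions.

```python
import numpy as np

# Hypothetical templates: mean normalized resistance change per
# finger (thumb..pinky) for a few example gestures. Values are
# illustrative, not measurements from the cited paper.
GESTURE_TEMPLATES = {
    "fist":      np.array([0.9, 0.9, 0.9, 0.9, 0.9]),
    "open_hand": np.array([0.1, 0.1, 0.1, 0.1, 0.1]),
    "point":     np.array([0.9, 0.1, 0.9, 0.9, 0.9]),
    "thumbs_up": np.array([0.1, 0.9, 0.9, 0.9, 0.9]),
}

def classify_gesture(reading: np.ndarray) -> str:
    """Return the template gesture closest (Euclidean) to the reading."""
    return min(GESTURE_TEMPLATES,
               key=lambda g: np.linalg.norm(reading - GESTURE_TEMPLATES[g]))

print(classify_gesture(np.array([0.85, 0.15, 0.92, 0.88, 0.95])))  # → point
```

In practice, a trained classifier (e.g., a neural network, as in the cited work) replaces the fixed templates, but the pipeline shape is the same: sensor channels in, gesture label out.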

The accurate recognition of gestures lays a solid foundation for gesture interaction and sign language communication. To meet the urgent demand for gesture interaction, Veeramuthu et al. proposed conductive fibers produced through electrospinning[194]. The conductive fibers are mounted on a commercial glove to create a hysteresis-free smart glove that converts biomechanical gestures into electrical signals, establishing a wearable gesture interaction interface between people (as shown in Figure 11B).
Furthermore, using a continuous, mass-producible, and low-cost spinning technology, Wu et al. designed a full-fiber auxetic-interlaced yarn sensor[195]. With the sensor array, an ultrafast full-letter sign-language translation glove was developed to translate daily dialogues and complex sentences, which can eliminate the communication barriers between signers and non-signers (as shown in Figure 11C). The overall accuracy across all letters is 99.8%, and the average recognition time is less than 0.25 s, demonstrating excellent potential for practical applications. Also, in sign-to-speech translation, Zhou et al. demonstrated a translation system consisting of yarn-based stretchable sensor arrays and a wireless printed circuit board[144]. Assisted by machine learning, the wearable sign-to-speech translation system allowed real-time translation of signs into spoken words with an accuracy of 98.63%.
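A full-letter translation glove ultimately has to turn a stream of per-frame letter predictions into readable text. The sketch below shows one common way to do that: collapse consecutive repeats and use a blank symbol to separate genuinely repeated letters, loosely following CTC-style decoding. This is an illustrative simplification, not the decoding scheme of the cited systems, and the frame data is made up.

```python
# Hypothetical sketch: decode a stream of per-frame letter
# predictions from a fingerspelling glove into text. "-" is a
# blank symbol emitted between distinct signs; runs of the same
# letter are collapsed into one character.
def decode_letters(frames):
    out = []
    prev = None
    for ch in frames:
        if ch != prev and ch != "-":
            out.append(ch)
        prev = ch
    return "".join(out)

frames = list("HHH-EEE-LL-LLL-OO")
print(decode_letters(frames))  # → HELLO
```

The blank symbol is what lets the decoder distinguish a held sign (one letter) from the same sign performed twice in a row (two letters, as in the double "L" above).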

In addition to gesture interaction between people, human-machine gesture collaboration is another focus of researchers working toward intelligent machines. As shown in Figure 11D, Yang et al. reported scalable fiber electronics that could be designed as an optoelectronic synergistic smart data glove for human-machine interaction[196]. The smart glove could manipulate hands in virtual space and further control manipulators in real-life scenarios. Moreover, Zhang et al. designed a textile-based electronic device that can control machine hands via human hand gestures[197], showing the significant potential of wearable electronic textiles for reliable human-robot interaction.
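For glove-controlled manipulators of this kind, the core signal path is a per-finger mapping from sensor output to actuator command. The sketch below assumes normalized flex readings (0 = straight, 1 = fully bent) driving hobby-style servos over a 0-180 degree range with a linear mapping; both the servo range and the linearity are assumptions for illustration, not details from the cited works.

```python
# Hypothetical sketch: map normalized flex-sensor readings to servo
# joint angles for a robot hand. Readings are clamped to [0, 1] to
# tolerate sensor noise, then scaled linearly onto the servo range.
SERVO_MIN_DEG = 0.0
SERVO_MAX_DEG = 180.0

def flex_to_servo(readings):
    """Return one servo angle (degrees) per flex-sensor reading."""
    angles = []
    for r in readings:
        r = max(0.0, min(1.0, r))  # clamp out-of-range values
        angles.append(SERVO_MIN_DEG + r * (SERVO_MAX_DEG - SERVO_MIN_DEG))
    return angles

print(flex_to_servo([0.0, 0.5, 1.2]))  # → [0.0, 90.0, 180.0]
```

Real systems typically add per-user calibration (recording each sensor's straight and fully-bent values) and low-pass filtering before this mapping, but the linear rescaling step remains the same.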

               VR and AR control
The rapid development of VR and AR technologies has paved the way for diverse applications in social activities, sports training, leisure and entertainment, games, and other fields[198-202]. Smart textiles represent an ideal human-machine interface for VR/AR applications. As shown in Figure 12A, an optically driven wearable human-machine interface smart textile was developed by Ma et al., which could sense slight finger slips and classify touch manners with the help of machine learning, achieving a recognition accuracy as high as 98.1%[203]. When the smart textile was attached to a doll, the virtual doll on the computer could express various emotions according to the touch mode perceived by the real doll. To realize AI-enabled sign language recognition and bidirectional communication in VR space, Wen et al. proposed an intelligent system comprising sensing gloves, an AI block, and a VR interaction interface[204]. It is worth noting that the intelligent system can recognize new sentences created by recombining word elements in new orders, with an average accuracy of 86.67%. The results of sign language recognition in the real world were mapped into virtual space and translated into visual text or voice, showing the potential applications of intelligent sign language recognition and communication systems in the future [Figure 12B].

As living standards rise, people’s expectations for entertainment services also grow.
               Mapping human motion signals into virtual space to enable VR games is currently the key direction for the