Figure 7. The principle of robotic object property recognition: (A) a robotic hand with tactile sensors grasps an object; (B) detection of external local properties, including surface texture, stiffness, thermal conductivity, and chemical substances; (C) detection of global properties such as weight, position of the centroid, shape, and pose; (D) detection of internal properties.


tactile methods can serve as an effective complement when operating in dimly lit or occluded environments. Besides, the fusion of vision and tactile methods can significantly increase the precision of object recognition [12]. Figure 7C illustrates the tactile exploration actions and signals used to detect global properties such as inertial parameters, shape, and pose.
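
As a toy illustration of such vision-tactile fusion, the following minimal Python sketch concatenates per-grasp visual and tactile feature vectors and trains a single classifier on the joint representation. The feature dimensions, random data, and labels are placeholders for illustration, not taken from the cited work.

```python
import numpy as np
from sklearn.svm import SVC

# Placeholder features: 128-dim visual descriptors and 64-dim tactile
# descriptors per grasp (dimensions are illustrative assumptions).
rng = np.random.default_rng(0)
vision = rng.random((100, 128))
tactile = rng.random((100, 64))
labels = rng.integers(0, 5, 100)

# Feature-level fusion: concatenate the two modalities, then train a
# single classifier on the joint representation.
fused = np.concatenate([vision, tactile], axis=1)
clf = SVC().fit(fused, labels)
```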

Inertial parameters play an important role in robotic manipulation since an accurate measurement of weight serves as an auxiliary estimate for force control, and grasping at the center of mass (CoM) prevents large torques from being applied to the manipulator [133]. Most researchers lift the target object using a manipulator with 6D force and torque sensors, and methods such as cross-correlation analysis [46] and feedforward neural networks (FNNs) [134] are used to estimate the inertial parameters. Besides, CoM measurement is more accurate when the tactile signal is combined with an a priori estimate from the visual patterns [133].
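
For the static case, weight and CoM follow directly from the measured wrench: with the grasped object at rest, the force reading equals the weight vector and the torque obeys tau = r x F. Below is a minimal sketch of this estimate, assuming gravity-only readings taken at several hand orientations with the manipulator's own load already subtracted; the function and data layout are illustrative, not the cited methods.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_mass_com(forces, torques, g=9.81):
    """Estimate mass and CoM from static wrist wrench readings.

    forces, torques: (N, 3) arrays in the sensor frame, taken at several
    distinct hand orientations, with the hand's own load subtracted.
    """
    forces = np.asarray(forces, dtype=float)
    torques = np.asarray(torques, dtype=float)
    # At rest, each force reading is the weight vector: m = |F| / g.
    mass = np.linalg.norm(forces, axis=1).mean() / g
    # Torque balance: tau = r x F = -[F]_x r. Stacking readings from
    # several orientations makes the least-squares system for r full rank.
    A = np.vstack([-skew(f) for f in forces])
    b = torques.ravel()
    com, *_ = np.linalg.lstsq(A, b, rcond=None)
    return mass, com
```
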
Shape and pose provide the most intuitive impression of the interactive target, and they are traditionally detected by visual devices in most cases. A more complete description of the object can significantly improve the stability and controllability of robotic manipulation. For haptic shape perception, large-scale force arrays are commonly used in grasping manipulators to generate tactile images, which can be further processed with advanced algorithms such as k-NN [91], SVM [136], and so on; a minimal classification sketch follows below. Combined with vision images or kinaesthetic data, particularly joint angles, a spatial point cloud of the object can be further fitted using methods such as the iterative closest labeled point (iCLAP) [137], Gaussian processes (GPs) [135], etc.
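
As a toy illustration of tactile-image shape classification, the sketch below trains a k-NN classifier on flattened pressure frames. The 16x16 array size, the random data, and the three shape classes are assumptions for illustration, not from the cited studies.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Placeholder dataset: one 16x16 pressure frame per grasp, flattened
# to a 256-dim vector, with one shape label per sample.
rng = np.random.default_rng(0)
frames = rng.random((300, 16 * 16))
shapes = rng.integers(0, 3, 300)   # e.g., sphere / box / cylinder

X_tr, X_te, y_tr, y_te = train_test_split(frames, shapes, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("held-out accuracy:", knn.score(X_te, y_te))
```
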
For pose sensing, a more accurate estimation of the object pose contributes to stable robotic manipulation. Since the action of touch is intrusive and can cause rotation and translation of the object, proximity sensors are sometimes utilized to capture pose parameters [3]. A more accurate pose estimate is obtained when vision and tactile data are combined using algorithms such as the translation-invariant quaternion filter (TIQF) [138] or convolutional neural networks (CNNs) [139]. Besides, vision-based tactile devices, especially GelSight, have advantages in the shape and pose sensing of small objects using advanced image-processing methods [42].
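
The TIQF and CNN estimators cited above are beyond a short excerpt, but the core geometric step of touch-based pose estimation can be illustrated with the standard SVD (Kabsch) solution for the rigid transform between corresponded model points and sensed contact points. This is a generic stand-in for illustration, not the algorithms from the cited works.

```python
import numpy as np

def rigid_pose(model_pts, contact_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto
    contact_pts; both are (N, 3) arrays with known correspondences."""
    mu_m = model_pts.mean(axis=0)
    mu_c = contact_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mu_m).T @ (contact_pts - mu_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_c - R @ mu_m
    return R, t
```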