Figure 8. The experiment setup and absolute trajectory error results on the actual robotic transfemoral prosthesis.
3.3 Evaluation on a robotic transfemoral prosthesis
The proposed approach has also been evaluated on an actual robotic transfemoral prosthesis to demonstrate its real-world viability. As shown in Figure 8, the camera was mounted on the knee joint of the prosthesis, and an amputee subject wearing the prosthesis was instructed to climb stairs. The estimated camera trajectory was recorded and compared with the ground truth; the results are shown in Figure 8. The camera motion trajectory estimated by the proposed method still aligns well with the ground truth, confirming its viability for application on actual robots. The error is slightly larger than that obtained with the healthy subject, owing to unavoidable mechanical vibration of the joints and of the adapter connecting the camera to the prosthesis.
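For reference, absolute trajectory error is typically computed by rigidly aligning the time-associated estimated trajectory to the ground truth and taking the RMSE of the residual translations. The following is a minimal sketch of that standard computation, not the paper's exact evaluation code; the function and variable names are illustrative assumptions.

```python
import numpy as np

def absolute_trajectory_error(est, gt):
    """Sketch of ATE: rigidly align est (Nx3) to gt (Nx3), return translational RMSE.

    Assumes the two trajectories are already time-associated; the alignment
    uses the standard Horn/Umeyama closed-form rigid-body solution.
    """
    est, gt = np.asarray(est, float), np.asarray(gt, float)
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # Cross-covariance between the centered point sets
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Reflection correction to guarantee a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_g - R @ mu_e
    aligned = est @ R.T + t
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))
```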
4. DISCUSSION
The findings presented in this study on improved staircase shape feature extraction hold significant implications for the field of walking-aid robots. The successful development of an algorithm capable of accurately perceiving and interpreting staircase shapes represents a notable advancement toward enhancing the autonomy, safety, and adaptability of robotic systems navigating complex environments. It is also important to examine the practical application scenarios in which our staircase shape feature extraction method significantly enhances the operation of walking-aid robots. By examining these aspects, we aim to underscore the method's impact on robot navigation and safety in real-world environments.
4.1 Key findings of this work
The key findings of this work in staircase shape feature extraction for walking-aid robots are:
1. Robust Feature Extraction: The developed method successfully overcomes the limitations of existing ap-
proaches, ensuring reliable extraction of staircase features even in challenging scenarios, such as restricted
viewpoints and rapid movements of the robot.
2. Improved Point Cloud Registration: The integration of RANSAC and KNN-augmented ICP algorithms significantly enhances the point cloud registration process, leading to more accurate and efficient handling of environmental data (see the sketch after this list).
3. Enhanced Navigation Capabilities: The advancements in feature extraction and point cloud processing con-
tribute to the improved navigation capabilities of walking-aid robots, particularly in complex environments
with staircases.
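As a loose illustration of how such a two-stage registration can be organized, the sketch below performs a coarse RANSAC alignment on feature matches followed by ICP refinement, whose correspondence search is itself nearest-neighbour based. It uses Open3D; the FPFH features, thresholds, and voxel sizes are illustrative assumptions and this is not the paper's exact KNN-augmented pipeline.

```python
import open3d as o3d

def register(source, target, voxel=0.02):
    """Coarse-to-fine registration sketch: RANSAC over FPFH feature matches,
    then point-to-plane ICP refinement. Parameters are illustrative only."""
    def preprocess(pcd):
        down = pcd.voxel_down_sample(voxel)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
        fpfh = o3d.pipelines.registration.compute_fpfh_feature(
            down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
        return down, fpfh

    src_down, src_fpfh = preprocess(source)
    tgt_down, tgt_fpfh = preprocess(target)

    # Stage 1: global (coarse) alignment with RANSAC on feature correspondences
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src_down, tgt_down, src_fpfh, tgt_fpfh, True, voxel * 1.5,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 1.5)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Stage 2: local refinement with ICP (nearest-neighbour correspondences)
    fine = o3d.pipelines.registration.registration_icp(
        src_down, tgt_down, voxel * 0.8, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return fine.transformation
```

The coarse stage makes the refinement robust to large initial misalignments, which is the general motivation for combining RANSAC with ICP in staircase point cloud registration.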
Overall, these findings represent a substantial step forward in robotics, particularly in enhancing the environ-
mental perception and navigational proficiency of walking-aid robots.