Speech-Based Human-Exoskeleton Interaction for Lower Limb Motion Planning
Publication Date
1-1-2024
Document Type
Conference Proceeding
Publication Title
2024 IEEE 4th International Conference on Human-Machine Systems, ICHMS 2024
DOI
10.1109/ICHMS59971.2024.10555587
Abstract
This study presents a speech-based motion planning (SBMP) strategy developed for lower limb exoskeletons to facilitate safe and compliant human-robot interaction. A speech processing system, a finite state machine (FSM), and a central pattern generator (CPG) are the building blocks of the proposed strategy for online planning of the exoskeleton's trajectory. A novel set of CPG dynamics is proposed to synchronize time-continuous transitions between exoskeleton locomotion states (e.g., sit, stand, walk) in response to discrete user inputs, while speech inputs are processed through an FSM. According to experimental evaluations, the speech processing system achieved low word and intent error rates. For locomotion, task completion with voice commands was 54% faster than with a mobile app interface. With the proposed SBMP strategy, users are able to maintain their postural stability while keeping both hands free. This supports its use as an effective motion planning method for the assistance and rehabilitation of individuals with lower-limb impairments.
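To illustrate the role of the FSM described in the abstract, the following minimal Python sketch maps discrete speech intents to locomotion state changes and ignores commands that are not valid from the current state. The state names, intent labels, and class names are assumptions for illustration only and are not taken from the paper; the paper's CPG dynamics, which would generate the smooth joint trajectories for each transition, are not modeled here.

```python
from enum import Enum, auto


class LocomotionState(Enum):
    SIT = auto()
    STAND = auto()
    WALK = auto()


# Allowed transitions keyed by (current state, recognized intent).
# Intent labels are hypothetical, not the paper's command vocabulary.
TRANSITIONS = {
    (LocomotionState.SIT, "stand_up"): LocomotionState.STAND,
    (LocomotionState.STAND, "sit_down"): LocomotionState.SIT,
    (LocomotionState.STAND, "walk"): LocomotionState.WALK,
    (LocomotionState.WALK, "stop"): LocomotionState.STAND,
}


class ExoskeletonFSM:
    """Minimal finite state machine that turns discrete speech intents
    into locomotion state changes; disallowed commands are ignored."""

    def __init__(self, initial: LocomotionState = LocomotionState.SIT):
        self.state = initial

    def on_intent(self, intent: str) -> LocomotionState:
        next_state = TRANSITIONS.get((self.state, intent))
        if next_state is not None:
            # In the full system, a CPG would produce the time-continuous
            # joint trajectory that realizes this state transition.
            self.state = next_state
        return self.state


if __name__ == "__main__":
    fsm = ExoskeletonFSM()
    for intent in ["stand_up", "walk", "sit_down", "stop"]:
        print(intent, "->", fsm.on_intent(intent).name)
```

In this sketch, "sit_down" issued while walking is simply ignored, reflecting the general idea of restricting transitions to safe, well-defined state pairs before any trajectory is generated.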
Department
Mechanical Engineering
Recommended Citation
Eddie Guo, Christopher Perlette, Mojtaba Sharifi, Lukas Grasse, Matthew Tata, Vivian K. Mushahwar, and Mahdi Tavakoli. "Speech-Based Human-Exoskeleton Interaction for Lower Limb Motion Planning" 2024 IEEE 4th International Conference on Human-Machine Systems, ICHMS 2024 (2024). https://doi.org/10.1109/ICHMS59971.2024.10555587