SoftBoMI: A Non-invasive Wearable Body-Machine Interface for Mapping Movement of Shoulder to Commands

Journal of Neural Engineering, 2024

Objective: Customized human-machine interfaces for controlling assistive devices are vital for improving the self-help ability of upper limb amputees and tetraplegic patients. Given that most of these patients retain residual shoulder mobility, using it to generate commands for operating assistive devices can serve as a complementary approach to brain-computer interfaces.

Approach: We propose a hybrid body-machine interface prototype that integrates soft sensors and an inertial measurement unit. This study introduces both a rule-based data decoding method and a user intent inference-based decoding method to map human shoulder movements into continuous commands. Additionally, by incorporating prior knowledge of the user's operational performance into a shared autonomy framework, we implement an adaptive switching command mapping approach. This approach enables seamless transitions between the two decoding methods, enhancing their adaptability across different tasks.
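As a rough sketch of how such an adaptive switching scheme could be organized, the Python snippet below blends a rule-based decoder and an intent-inference decoder using a weight derived from prior task performance; the decoder internals, the calibration matrix, and the weighting rule are illustrative assumptions, not the implementation reported in the paper.

```python
import numpy as np

def rule_based_decode(reading):
    """Placeholder rule-based decoder: a fixed linear map from the current
    normalized shoulder readings (two soft sensors + one IMU channel) to a
    2D command. The matrix is illustrative, not the paper's calibration."""
    W = np.array([[1.0, -1.0, 0.0],
                  [0.0,  0.0, 1.0]])
    return W @ reading

def intent_inference_decode(history):
    """Placeholder intent-inference decoder: the rule-based map applied to a
    smoothed window of recent readings stands in for the paper's learned
    inference model (an assumption for this sketch)."""
    return rule_based_decode(np.mean(history, axis=0))

def adaptive_command(reading, history, task_prior):
    """Blend the two decoders with a weight derived from prior operational
    performance on the current task: task_prior near 0 favors the rule-based
    decoder (fast, dynamic tasks), near 1 favors intent inference."""
    alpha = float(np.clip(task_prior, 0.0, 1.0))
    return (1.0 - alpha) * rule_based_decode(reading) \
           + alpha * intent_inference_decode(history)

# Example: last 20 samples of (sensor1, sensor2, yaw-rate) readings.
history = np.random.default_rng(0).normal(size=(20, 3))
command = adaptive_command(history[-1], history, task_prior=0.7)  # -> 2D command
```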

Main results: The proposed method has been validated on individuals with cervical spinal cord injury, bilateral arm amputation, and healthy subjects through a series of center-out target reaching tasks and a virtual powered wheelchair driving task. The experimental results show that using both the soft sensors and the gyroscope yields the most well-rounded intent-inference performance. Additionally, the rule-based method demonstrates better dynamic performance for wheelchair operation, while the intent-inference method is more accurate but has higher latency. The adaptive switching approach offers the best adaptability by seamlessly transitioning between decoding methods for different tasks. Furthermore, we discuss the differences and characteristics among the various types of participants in the experiment.

Significance: The proposed method has the potential to be integrated into clothing, enabling non-invasive interaction with assistive devices in daily life, and could serve as a tool for rehabilitation assessment in the future.

Remapping Movement of Shoulder with Soft Sensors: A Non-invasive Wearable Body-Machine Interface

IEEE Sensors Journal, 2024

Soft sensors are typically composed of flexible materials capable of bending and stretching, thereby accommodating various human movements. This study presents a human-machine interface utilizing silicone-based soft sensors and inertial measurement units (IMUs) to convert shoulder movements of amputees and quadriplegics into continuous two-dimensional commands for controlling assistive devices. We have developed a rule-based data decoding method and an intent-inference-based data decoding method to generate real-time commands. Furthermore, by integrating prior knowledge of the user's manipulation performance into a shared autonomy framework, we have implemented a shared command mapping approach to improve overall command generation performance. To validate the proposed interface prototype, we conducted center-out target reaching tasks and a virtual wheelchair driving task involving nine healthy subjects. The experimental results indicate that the soft sensors can effectively capture human shoulder movements, thereby generating reliable interaction commands.
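For intuition only, here is a minimal Python sketch of a rule-based mapping from two soft-sensor stretch readings plus an IMU yaw rate to a joystick-like 2D command; the sensor ranges, deadzone, and axis assignments are assumptions made for illustration rather than the calibration used in the study.

```python
import numpy as np

class ShoulderCommandMapper:
    """Illustrative mapping from two soft-sensor stretch readings and an IMU
    yaw rate to a 2D command in [-1, 1]^2. The ranges and deadzone are
    assumptions for the sketch, not parameters reported in the paper."""

    def __init__(self, stretch_rest, stretch_max, yaw_rate_max=60.0, deadzone=0.1):
        self.rest = np.asarray(stretch_rest, dtype=float)   # readings at rest
        self.span = np.asarray(stretch_max, dtype=float) - self.rest
        self.yaw_max = yaw_rate_max                          # deg/s, assumed
        self.deadzone = deadzone

    def __call__(self, stretch, yaw_rate):
        # Forward/backward axis from the difference of the two stretch sensors.
        s = (np.asarray(stretch, dtype=float) - self.rest) / self.span
        forward = np.clip(s[0] - s[1], -1.0, 1.0)
        # Left/right axis from the normalized yaw rate of the IMU.
        turn = np.clip(yaw_rate / self.yaw_max, -1.0, 1.0)
        cmd = np.array([forward, turn])
        cmd[np.abs(cmd) < self.deadzone] = 0.0               # suppress drift
        return cmd

# Example with made-up calibration values.
mapper = ShoulderCommandMapper(stretch_rest=[0.2, 0.2], stretch_max=[1.0, 1.0])
print(mapper(stretch=[0.7, 0.3], yaw_rate=15.0))
```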

Real-time Gait Phase Estimation Based on Multi-source Flexible Sensors Fusion

RobCE 23: Proceedings of the 2023 3rd International Conference on Robotics and Control Engineering, 2023

The real-time gait phase of the human lower extremity is the foundation for wearable robots to provide precise and complex assistance strategies in human-robot interaction. Beyond estimation performance, it is crucial to make such devices portable and user-friendly, which can drive their adoption in unstructured environments. In this paper, we present an online continuous gait phase estimation system based on multi-source flexible sensors that addresses this issue. Specifically, we utilize two soft bend sensors mounted around the hip joint and a set of flexible pressure sensors mounted on the bottom of the foot to track the real-time motion of the lower limbs. Adaptive nonlinear frequency oscillators (ANFOs) are coupled with the captured motion to generate a sequential, linearly growing gait phase. Moreover, heel strike events are detected to calculate the phase shift and synchronize the estimated phase with the actual gait. A uniform walking experiment validates the performance of the proposed method. The experimental results demonstrate that our approach provides accurate gait phase information and has the potential to improve the interaction transparency of exoskeleton robots in the future.
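The adaptive oscillator component can be illustrated with a minimal sketch: the class below implements a single adaptive Hopf oscillator of the classic adaptive-frequency form, which entrains to a periodic teaching signal such as a hip bend-sensor reading. The gains are placeholders; in the paper, detected heel strikes are additionally used to remove the residual phase shift, which here would amount to resetting the returned phase at each heel strike.

```python
import numpy as np

class AdaptiveHopfOscillator:
    """One adaptive Hopf oscillator (a building block of an ANFO pool).
    It synchronizes to a periodic teaching signal F(t), e.g. a hip
    bend-sensor reading, and its phase can serve as a continuous gait
    phase. All gains are illustrative, not the paper's values."""

    def __init__(self, omega0=2 * np.pi, gamma=8.0, mu=1.0, eps=0.9):
        self.x, self.y = 1.0, 0.0     # oscillator state on the limit cycle
        self.omega = omega0           # rad/s, adapts toward the input frequency
        self.gamma, self.mu, self.eps = gamma, mu, eps

    def step(self, F, dt):
        """Euler-integrate one sample of the teaching signal F with step dt,
        and return the current phase estimate in [0, 2*pi)."""
        r2 = self.x ** 2 + self.y ** 2
        dx = self.gamma * (self.mu - r2) * self.x - self.omega * self.y + self.eps * F
        dy = self.gamma * (self.mu - r2) * self.y + self.omega * self.x
        domega = -self.eps * F * self.y / max(np.sqrt(r2), 1e-9)
        self.x += dx * dt
        self.y += dy * dt
        self.omega += domega * dt
        return np.arctan2(self.y, self.x) % (2 * np.pi)
```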

Recommended Citation:

X. Zhao, R. Liu, T. Ma, H. Li, and Q. Song, “Real-time Gait Phase Estimation Based on Multi-source Flexible Sensors Fusion,” in Proceedings of the 2023 3rd International Conference on Robotics and Control Engineering, Nanjing, China: ACM, May 2023, pp. 113–118. doi: 10.1145/3598151.3598223.

Adaptive Symmetry Reference Trajectory Generation in Shared Autonomy for Active Knee Orthosis

IEEE Robotics and Automation Letters, 2023 (presented at IROS 2023)

Gait symmetry training plays an essential role in the rehabilitation of hemiplegic patients, and robotics-based gait training has been widely accepted by patients and clinicians. Generating a reference trajectory for the affected side from the motion data of the unaffected side is an important way to achieve this. However, online generation of the gait reference trajectory requires the algorithm to provide the correct gait phase delay while reducing the impact of measurement noise from the sensors and input uncertainty from the user. Based on an active knee orthosis (AKO) prototype, this work presents an adaptive symmetric gait trajectory generation framework for the gait rehabilitation of hemiplegic patients. Using adaptive nonlinear frequency oscillators (ANFOs) and movement primitives, we implement online gait pattern encoding and adaptive phase delay according to real-time user input. A shared autonomy (SA) module with online input validation and arbitration is designed to prevent undesired movements from being transmitted to the actuator on the affected side. The experimental results demonstrate the feasibility of the framework. Overall, this work suggests that the proposed method has the potential to support gait symmetry rehabilitation in unstructured environments and to provide a kinematic reference for a torque-assisted AKO.
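A hedged sketch of the trajectory-generation idea: encode one gait cycle of the unaffected knee angle, evaluate it at a half-cycle phase delay to obtain the affected-side reference, and gate the result with a simple validity check. The Fourier-series encoding and the numeric limits below are assumptions standing in for the paper's movement primitives and SA arbitration.

```python
import numpy as np

def encode_gait_cycle(phases, angles, n_harmonics=5):
    """Fit a truncated Fourier series to one gait cycle of the unaffected
    knee angle; this stands in for the paper's movement-primitive encoding."""
    A = np.column_stack(
        [np.ones_like(phases)]
        + [f(k * phases) for k in range(1, n_harmonics + 1) for f in (np.sin, np.cos)]
    )
    coeffs, *_ = np.linalg.lstsq(A, angles, rcond=None)
    return coeffs

def reference_angle(coeffs, phase, phase_delay=np.pi):
    """Evaluate the encoded pattern at the delayed phase to obtain the
    affected-side reference (a half-cycle delay gives a symmetric gait)."""
    p = phase + phase_delay
    n_harmonics = (len(coeffs) - 1) // 2
    basis = [1.0] + [f(k * p) for k in range(1, n_harmonics + 1) for f in (np.sin, np.cos)]
    return float(np.dot(coeffs, basis))

def arbitrate(reference, previous, angle_limits=(0.0, 70.0), max_step=2.0):
    """Simple shared-autonomy style gate: reject references outside the knee
    range of motion or changing faster than max_step degrees per control tick
    (limits are illustrative, not the paper's values)."""
    if not (angle_limits[0] <= reference <= angle_limits[1]):
        return previous
    if abs(reference - previous) > max_step:
        return previous + np.sign(reference - previous) * max_step
    return reference
```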

Recommended Citation:

R. Liu et al., “Adaptive Symmetry Reference Trajectory Generation in Shared Autonomy for Active Knee Orthosis,” IEEE Robotics and Automation Letters, vol. 8, no. 6, pp. 3118–3125, Jun. 2023, doi: 10.1109/LRA.2023.3264767.

LSTM-Based Lower Limbs Motion Reconstruction Using Low-Dimensional Input of Inertial Motion Capture System

IEEE Sensors Journal, 2020

Motion capture systems have been widely used in virtual reality and rehabilitation. This study proposes a data-driven method that uses the low-dimensional input of an inertial motion capture system to reconstruct human lower-limb motions. A long short-term memory (LSTM) neural network is used, and an ensemble LSTM architecture is introduced to improve reconstruction performance. In addition, the selection of the optimal sensor configuration and the time-step parameters of the LSTM network is discussed in detail. The reconstruction experiment shows that the method achieves the lowest joint-angle root mean square (RMS) reconstruction errors of 4.031° on the separated-motion dataset and 5.105° on a completely new dataset of synthetic motions, using an ensemble LSTM model with 18 base learners and three sensor units. The computational cost test shows that the single and ensemble LSTM models take 0.15 ms and 0.91 ms, respectively, to predict the next frame. These findings demonstrate that the proposed method is effective and efficient for lower-limb motion reconstruction.
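A minimal PyTorch sketch of the ensemble idea, assuming nine input channels from three sensor units and six output joint angles (the hidden size and other hyperparameters are illustrative, not the paper's configuration):

```python
import torch
import torch.nn as nn

class LowerLimbLSTM(nn.Module):
    """One base learner: maps a window of low-dimensional IMU features to
    lower-limb joint angles. Sizes are illustrative placeholders."""

    def __init__(self, n_inputs=9, n_joints=6, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_joints)

    def forward(self, x):                 # x: (batch, time_steps, n_inputs)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # predict angles for the last frame

class EnsembleLSTM(nn.Module):
    """Average the predictions of several independently trained base
    learners, mirroring the ensemble idea described in the abstract."""

    def __init__(self, n_models=18, **kwargs):
        super().__init__()
        self.models = nn.ModuleList([LowerLimbLSTM(**kwargs) for _ in range(n_models)])

    def forward(self, x):
        return torch.stack([m(x) for m in self.models]).mean(dim=0)

# Example: a batch of 32 windows, 20 time steps, 9 IMU channels -> (32, 6) angles.
angles = EnsembleLSTM()(torch.randn(32, 20, 9))
```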

Recommended Citation:

L. Tong, R. Liu, and L. Peng, “LSTM-Based Lower Limbs Motion Reconstruction Using Low-Dimensional Input of Inertial Motion Capture System,” IEEE Sensors J., vol. 20, no. 7, pp. 3667–3677, Apr. 2020, doi: 10.1109/JSEN.2019.2959639.

A Novel Method for Parkinson’s Disease Classification and Dyskinesia Quantification Using Wearable Inertial Sensors

IEEE 9th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), 2019

This work introduces a novel method for the classification and dyskinesia quantification of Parkinson's disease (PD) patients. We use only the data from two 9-axis inertial sensors (3D gyroscope and 3D accelerometer only) located at the wrist and ankle. An SVM classifier with an RBF kernel is applied to classify data frames from PD subjects and controls, achieving an accuracy of 87.1%. Additionally, a pr value is proposed to quantify dyskinesia into three levels over a fixed sampling period. Using the quadratic weighted Kappa coefficient, we evaluate the inter-rater reliability between manual quantification and our proposed method (k = 0.961). The results indicate that our method is effective for distinguishing PD patients from healthy individuals and for quantitatively evaluating dyskinesia severity levels, with prospective applications in continuous disease monitoring and levodopa dosing assessment for PD patients in the future.
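A minimal scikit-learn sketch of the classification stage, assuming simple statistical features per data frame (the actual feature set is not specified in the abstract):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def frame_features(window):
    """Illustrative features per data frame: mean, standard deviation, and
    RMS of each of the 12 channels (3D gyro + 3D accel at wrist and ankle)."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.sqrt((window ** 2).mean(axis=0))])

def train_classifier(X_windows, y):
    """X_windows: (n_frames, samples_per_frame, 12); y: 1 for PD, 0 for control.
    Fits an RBF-kernel SVM, as named in the abstract, on standardized features."""
    X = np.stack([frame_features(w) for w in X_windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X, y)
    return clf
```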

Recommended Citation:

R. Liu, L. Peng, L. Tong, and Y. Wu, “A Novel Method for Parkinson’s Disease Classification and Dyskinesia Quantification Using Wearable Inertial Sensors,” in 2019 IEEE 9th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Suzhou, China: IEEE, Jul. 2019, pp. 1022–1026. doi: 10.1109/CYBER46603.2019.9066660. 

The Design of Wearable Wireless Inertial Measurement Unit for Body Motion Capture System

IEEE International Conference on Intelligence and Safety for Robotics (ISR), 2018

Motion capture systems serve as a critical technology in a wide range of applications. Nowadays, motion capture systems based on inertial components have become a new research hotspot. This paper presents a new design of a low-cost wearable wireless inertial measurement unit based on the 9-DOF micro inertial sensor MPU9250 and an STM32F103C8T6 MCU. Furthermore, the ultra-low-power (ULP) 2 Mbps RF transceiver IC NRF24L01, which operates in the 2.4 GHz ISM (Industrial, Scientific, and Medical) band with peak RX/TX currents below 14 mA, is used for wireless transmission. Finally, a program built with Unity3D 5.6 on a PC displays the sensors' real-time attitude, and the experiments show good performance in both static and dynamic states.
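The abstract does not detail the firmware or host-side processing; as a loose illustration of the kind of attitude estimation such a unit supports, here is a minimal complementary filter that fuses gyroscope and accelerometer readings into roll and pitch (the blending gain and axis conventions are assumptions):

```python
import math

class ComplementaryFilter:
    """Minimal roll/pitch estimator for the 6-axis subset of an MPU9250-class
    IMU: integrate the gyroscope and correct drift with the accelerometer.
    The blending gain is an assumption, not taken from the paper."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha
        self.roll = 0.0   # rad
        self.pitch = 0.0  # rad

    def update(self, gyro, accel, dt):
        """gyro: (gx, gy, gz) in rad/s; accel: (ax, ay, az) in g; dt in s."""
        gx, gy, _ = gyro
        ax, ay, az = accel
        # Gyro integration (responsive, but drifts over time).
        roll_g = self.roll + gx * dt
        pitch_g = self.pitch + gy * dt
        # Accelerometer tilt (noisy, but drift-free).
        roll_a = math.atan2(ay, az)
        pitch_a = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        # Blend the two estimates.
        self.roll = self.alpha * roll_g + (1 - self.alpha) * roll_a
        self.pitch = self.alpha * pitch_g + (1 - self.alpha) * pitch_a
        return self.roll, self.pitch
```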

Recommended Citation:

R. Liu, L. Peng, L. Tong, K. Yang, and B. Liu, “The Design of Wearable Wireless Inertial Measurement Unit for Body Motion Capture System,” in 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), Shenyang, China: IEEE, Aug. 2018, pp. 557–562. doi: 10.1109/IISR.2018.8535742.