The Neuromuscular Rehabilitation Engineering Laboratory (NREL), part of the Joint Department of Biomedical Engineering at North Carolina State University and the University of North Carolina at Chapel Hill, develops innovative technologies for prosthetic limbs. The long-term research goal of NREL is to improve the quality of life of people with limb amputations.
Emerging powered lower limb prostheses have enabled lower limb amputees to efficiently perform various locomotor tasks, such as climbing stairs and even running. The research team at NREL has developed a novel neural-machine interface (NMI) to further advance the function of these robotic prosthetic legs [1-2]. This interface deciphers the wearer's neuromuscular signals, recognizes the intended locomotion task, and operates the prosthesis so that the wearer can switch between locomotor tasks intuitively and seamlessly. However, intent recognition errors from the NMI have been observed, and their effects on prosthesis operation and the wearer's task performance are unknown. Toward the development of robust neural control for powered artificial legs, the objective of the present study was to quantify the effects of intent recognition errors, generated by the NMI for powered transfemoral prostheses, on the balance of the prosthesis wearers. The Xsens MVN Analyze motion capture system was used to quantify the wearers' walking balance.
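To illustrate the kind of pattern recognition such an interface performs, the sketch below extracts standard time-domain features from windowed EMG and trains a linear discriminant classifier to label locomotion modes. This is only a minimal illustration, not the NMI of [1-2] (which fuses neuromuscular and mechanical signals in a phase-dependent classifier); the sampling rate, window length, channel count, and mode labels are all assumptions.

```python
# Minimal sketch of locomotion-intent recognition from windowed EMG features.
# Not the NMI described in [1-2]; window length, channel count, and labels
# are hypothetical placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000        # assumed EMG sampling rate (Hz)
WIN_MS = 150     # analysis window length (ms), assumed
STEP_MS = 50     # window increment (ms), assumed

def td_features(window):
    """Time-domain features per channel: MAV, waveform length, zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

def sliding_features(emg):
    """emg: (n_samples, n_channels) -> (n_windows, n_features)."""
    w = int(WIN_MS * FS / 1000)
    s = int(STEP_MS * FS / 1000)
    return np.vstack([td_features(emg[i:i + w])
                      for i in range(0, emg.shape[0] - w + 1, s)])

# Placeholder training data: EMG recorded during known locomotion modes
# (e.g., 0 = level walking, 1 = ramp ascent, 2 = ramp descent).
rng = np.random.default_rng(0)
emg_train = rng.standard_normal((60000, 8))    # 60 s of 8-channel EMG (synthetic)
X = sliding_features(emg_train)
y = rng.integers(0, 3, size=X.shape[0])        # placeholder mode labels

clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict(X[:5]))                      # predicted locomotion modes
```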
Method
Subjects
Two individuals with transfemoral amputation participated in this IRB-approved study. Written informed consent was obtained from each subject.
Measurements
The subjects wore 17 Xsens MTx sensors, mounted on the MVN straps, during the experiment (Figure 1). The kinematic data were sampled at 100 Hz.
Experimental protocol
During the experiments, the subjects wore a tethered, powered transfemoral prosthesis [3]. They were instructed to walk along a 15-foot level-ground pathway and to ascend or descend a 10-foot ramp with a platform. The intent recognition errors were generated by a neural control simulator programmed in LabVIEW (National Instruments, TX, USA).
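The error injection in the study was implemented in the LabVIEW-based neural control simulator; purely as an illustration of the idea, the Python sketch below overwrites a stream of decoded locomotion-mode decisions with a wrong mode for a specified duration starting at a gait event such as heel contact. The decision rate, mode labels, and event index are assumptions.

```python
# Illustrative sketch (not the LabVIEW simulator used in the study) of injecting
# a timed intent-recognition error pulse into a stream of decoded mode decisions.
import numpy as np

DECISION_RATE_HZ = 100           # assumed controller decision/update rate

def inject_error_pulse(decisions, onset_idx, duration_ms, wrong_mode):
    """Overwrite the decoded mode with `wrong_mode` for `duration_ms`,
    starting at `onset_idx` (e.g., the decision index of a heel contact)."""
    corrupted = decisions.copy()
    n = int(round(duration_ms * DECISION_RATE_HZ / 1000))
    corrupted[onset_idx:onset_idx + n] = wrong_mode
    return corrupted

# Example: level walking (mode 0) with a 300 ms "ramp descent" (mode 2) error
# injected at a hypothetical heel contact at decision index 250.
true_decisions = np.zeros(1000, dtype=int)
corrupted = inject_error_pulse(true_decisions, onset_idx=250,
                               duration_ms=300, wrong_mode=2)
print(np.count_nonzero(corrupted != true_decisions))   # 30 corrupted decisions
```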
Data processing
The position and angular velocity of each body segment, measured by the Xsens MVN Analyze motion capture system, were used to compute the whole-body angular momentum in the sagittal plane, a parameter for evaluating walking balance [4].
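A minimal sketch of this computation is shown below, assuming segment centre-of-mass positions, velocities, and angular velocities are available for each frame: the whole-body angular momentum about the body centre of mass is the sum of each segment's remote (transfer) term and local spin term, and only the component about the mediolateral axis (i.e., the sagittal-plane component) is kept. Segment masses and moments of inertia would normally come from anthropometric tables; the placeholder values, the number of segments, and the choice of axis index are assumptions.

```python
# Sketch of whole-body angular momentum about the body COM,
# L = sum_i [ m_i (r_i - r_com) x (v_i - v_com) + I_i * w_i ],
# keeping only the sagittal-plane (mediolateral-axis) component.
import numpy as np

def sagittal_angular_momentum(pos, vel, omega, mass, inertia):
    """
    pos, vel : (n_frames, n_segments, 3) segment COM position / velocity (m, m/s)
    omega    : (n_frames, n_segments, 3) segment angular velocity (rad/s)
    mass     : (n_segments,) segment masses (kg)
    inertia  : (n_segments,) segment moments of inertia about the mediolateral axis (kg m^2)
    returns  : (n_frames,) sagittal-plane whole-body angular momentum (kg m^2/s)
    """
    m = mass[None, :, None]
    r_com = np.sum(m * pos, axis=1) / mass.sum()           # body COM position
    v_com = np.sum(m * vel, axis=1) / mass.sum()           # body COM velocity
    r_rel = pos - r_com[:, None, :]
    v_rel = vel - v_com[:, None, :]
    # Assumed convention: axis index 1 is the mediolateral axis.
    transfer = mass[None, :] * np.cross(r_rel, v_rel)[..., 1]   # remote term
    local = inertia[None, :] * omega[..., 1]                     # local spin term
    return np.sum(transfer + local, axis=1)

# Hypothetical example: 23 body segments at 100 Hz with placeholder data.
n_frames, n_seg = 500, 23
rng = np.random.default_rng(1)
pos = rng.standard_normal((n_frames, n_seg, 3))
vel = rng.standard_normal((n_frames, n_seg, 3))
omega = rng.standard_normal((n_frames, n_seg, 3))
mass = np.full(n_seg, 70.0 / n_seg)       # crude equal split of a 70 kg body (placeholder)
inertia = np.full(n_seg, 0.05)            # placeholder segment inertias
L_sag = sagittal_angular_momentum(pos, vel, omega, mass, inertia)
```

In [4], the angular momentum is additionally normalized by the subject's body mass, height, and walking speed before comparison across subjects; that normalization step is omitted from this sketch.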
Results
Figure 2 shows examples of the whole-body angular momentum of one subject when two intent recognition errors were simulated at heel contact. The angular momentum was not affected when the error pulse was 100 ms long (Figure 2A), whereas significant forward momentum was observed when the error lasted 300 ms (Figure 2B).
Discussion
The Xsens MVN Analyze motion capture system provided reliable, high-quality kinematic recordings while the transfemoral amputees walked over varied terrain with a powered prosthesis. The results showed that not all intent recognition errors destabilized the prosthesis wearers; the effect of an error depended on when it occurred and how long it lasted. These results suggest that future engineering efforts should focus on eliminating the effects of the critical NMI errors that disturb the wearer's balance, in order to develop robust neural control for powered lower limb prostheses.
[1] Huang, He, Todd A. Kuiken, and Robert D. Lipschutz. "A strategy for identifying locomotion modes using surface electromyography." IEEE Transactions on Biomedical Engineering 56.1 (2009): 65-73.
[2] Huang, He, et al. "Continuous locomotion-mode identification for prosthetic legs based on neuromuscular-mechanical fusion." IEEE Transactions on Biomedical Engineering 58.10 (2011): 2867-2875.
[3] Liu, Ming, et al. "Improving finite state impedance control of active-transfemoral prosthesis using Dempster-Shafer based state transition rules." Journal of Intelligent & Robotic Systems (2013): 1-14.
[4] Herr, Hugh, and Marko Popovic. "Angular momentum in human walking." Journal of Experimental Biology 211.4 (2008): 467-481.