Classification of motor intent in transradial amputees using sonomyography and spatio-temporal image analysis
1 April 2016
Harishwaran Hariharan, Nima Akhlaghi, Clayton A. Baker, Huzefa Rangwala, Jana Kosecka, Siddhartha Sikdar
Abstract
In spite of major advances in the biomechanical design of upper extremity prosthetics, these devices continue to lack intuitive control. Conventional myoelectric control strategies typically utilize electromyography (EMG) signal amplitude sensed from forearm muscles. EMG has limited specificity in resolving deep muscle activity and a poor signal-to-noise ratio. We have been investigating alternative control strategies that rely on real-time ultrasound imaging and can overcome many of the limitations of EMG. In this work, we present an ultrasound image sequence classification method that utilizes spatio-temporal features to describe muscle activity and classify motor intent. Ultrasound images of the forearm muscles were obtained from able-bodied subjects and a transradial amputee while they attempted different hand movements. A grid-based approach is used to test the feasibility of using spatio-temporal features by classifying the hand motions performed by the subjects. Using leave-one-out cross-validation on image sequences acquired from able-bodied subjects, we observe that the grid-based approach is able to discern four hand motions with 95.31% accuracy. In the case of the transradial amputee, we are able to discern three hand motions with 80% accuracy. In a second set of experiments, we study classification accuracy by extracting spatio-temporal sub-sequences that depict activity due to the motion of local anatomical interfaces. Short, time- and space-limited cuboidal sequences are initially extracted and assigned an optical flow behavior label based on a response function. The image space is clustered based on the locations of the cuboids, and features are calculated from the cuboids in each cluster. Using sequences of known motions, we extract feature vectors that describe each motion. A k-nearest neighbor classifier is designed for the classification experiments. Using leave-one-out cross-validation on image sequences from an amputee subject, we demonstrate that the classifier is able to discern three important hand motions with 93.33% accuracy, 91–100% precision, and 80–100% recall. We anticipate that ultrasound imaging-based methods will address some limitations of conventional myoelectric sensing, while adding advantages inherent to ultrasound imaging.
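The abstract describes the classification pipeline only at a high level. As a rough illustration of how a grid-based variant could be assembled, the sketch below computes per-cell optical-flow magnitude features from an ultrasound image sequence and evaluates a k-nearest neighbor classifier under leave-one-out cross-validation. The Farneback flow estimator, the 4 x 4 grid, k = 3, and the sequences/labels variables are illustrative assumptions, not details taken from the paper.

# Minimal sketch (not the authors' code): grid-based optical-flow features from an
# ultrasound image sequence, classified with k-NN under leave-one-out cross-validation.
# Feature design, grid size, and k are illustrative assumptions.
import numpy as np
import cv2
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

def grid_flow_features(frames, grid=(4, 4)):
    """Summarize a grayscale image sequence by mean optical-flow magnitude per grid cell."""
    rows, cols = grid
    accum = np.zeros(grid, dtype=np.float64)
    for prev, curr in zip(frames[:-1], frames[1:]):
        # Dense optical flow between consecutive frames (Farneback method).
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=2)
        h, w = mag.shape
        for r in range(rows):
            for c in range(cols):
                cell = mag[r * h // rows:(r + 1) * h // rows,
                           c * w // cols:(c + 1) * w // cols]
                accum[r, c] += cell.mean()
    return (accum / max(len(frames) - 1, 1)).ravel()

# sequences: list of recorded image sequences; labels: intended hand motions (placeholders).
X = np.array([grid_flow_features(seq) for seq in sequences])
y = np.array(labels)

knn = KNeighborsClassifier(n_neighbors=3)
scores = cross_val_score(knn, X, y, cv=LeaveOneOut())
print(f"Leave-one-out accuracy: {scores.mean():.2%}")

In the second experiment described above, the simple grid summary would be replaced by the cuboid extraction, response-function labeling, and clustering steps that the paper outlines.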
© 2016 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Harishwaran Hariharan, Nima Akhlaghi, Clayton A. Baker, Huzefa Rangwala, Jana Kosecka, and Siddhartha Sikdar "Classification of motor intent in transradial amputees using sonomyography and spatio-temporal image analysis", Proc. SPIE 9790, Medical Imaging 2016: Ultrasonic Imaging and Tomography, 97901Q (1 April 2016); https://doi.org/10.1117/12.2217174
KEYWORDS
Ultrasonography
Electromyography
Image classification
Optical flow
Visualization
Feature extraction
Control systems
