Battelle used MATLAB to develop its signal processing and machine learning algorithms and to run them in real time.
The participant was shown a computer-generated virtual hand performing movements such as wrist flexion and extension, thumb flexion and extension, and hand opening and closing. He was instructed to think about making the same movements with his own hand.
Working in MATLAB, the team developed algorithms to analyze data from the 96 channels in the implanted electrode array. Using Wavelet Toolbox™, they performed wavelet decomposition to isolate the frequency ranges of the brain signals that govern movement.
They performed transforms on the results of the decomposition in MATLAB to calculate mean wavelet power (MWP), reducing the 3000 features captured during each 100-millisecond window for a single channel to a single value.
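The per-channel reduction might be sketched as below, assuming Wavelet Toolbox. The wavelet (`db4`), decomposition depth, and choice of detail bands are illustrative stand-ins, not Battelle's actual parameters.

```matlab
% Sketch: reduce one channel's 100 ms window to a single MWP value.
% Wavelet, depth, and band selection are illustrative assumptions.
x = randn(1, 3000);             % one 100 ms window of one channel (synthetic)
[c, l] = wavedec(x, 5, 'db4');  % multilevel 1-D wavelet decomposition
d = detcoef(c, l, [3 4 5]);     % detail bands assumed to cover the
                                % movement-related frequency range
mwp = mean([d{:}].^2);          % mean wavelet power: one value per channel
```

Repeating this for each of the 96 channels produces the 96-element feature vector described next.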
The resulting 96 MWP values were used as feature vectors for machine learning algorithms that translate the features into individual movements.
The team used MATLAB to test several machine learning techniques, including discriminant analysis and support vector machines (SVM), settling on a custom SVM optimized for performance.
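A minimal stand-in for such a decoder, assuming Statistics and Machine Learning Toolbox, could look like the following; Battelle's production SVM was custom-optimized, and the training data here are synthetic.

```matlab
% Stand-in multiclass SVM decoder trained on MWP feature vectors.
% fitcecoc combines binary SVM learners for multiclass decoding;
% the data, class count, and kernel choice are illustrative.
rng(0);
X = randn(600, 96);                       % 600 windows x 96 MWP features
y = randi(6, 600, 1);                     % 6 imagined movements (labels)
tmpl = templateSVM('KernelFunction', 'linear');
mdl  = fitcecoc(X, y, 'Learners', tmpl);  % multiclass via binary SVMs
move = predict(mdl, randn(1, 96));        % decode one new window
```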
During testing sessions, the team trained the SVM by having the participant attempt the movements shown in the videos. They used the trained SVM’s output to animate a computer-generated virtual hand that the participant could manipulate on screen. The same SVM output was scaled and used to control the 130 channels of the NMES sleeve.
While the participant moved his arm and hand to perform simple movements, all signal processing, decoding, and machine learning algorithms were run in MATLAB in real time on a desktop computer.
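Put together, one pass of that per-window pipeline might be sketched as follows. The acquisition source, decoder, and sleeve scaling are synthetic placeholders, not the NeuroLife implementation.

```matlab
% Runnable sketch of one real-time pass: decompose each channel,
% compute MWP, decode the movement, and scale output for the sleeve.
% Model, data source, and scaling are placeholders.
rng(0);
Xtr = randn(120, 96);  ytr = randi(6, 120, 1);        % synthetic training set
mdl = fitcecoc(Xtr, ytr, 'Learners', templateSVM());  % stand-in decoder
readWindow = @() randn(96, 3000);                     % placeholder acquisition
raw = readWindow();                                   % one 100 ms window
mwp = zeros(1, 96);
for ch = 1:96
    [c, l] = wavedec(raw(ch, :), 5, 'db4');
    d = detcoef(c, l, [3 4 5]);
    mwp(ch) = mean([d{:}].^2);                        % mean wavelet power
end
move = predict(mdl, mwp);                             % decoded movement class
stim = repmat(double(move) / 6, 1, 130);              % placeholder scaling to
                                                      % 130 NMES channels
```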
Battelle engineers are currently using MATLAB to develop algorithms for the second-generation NeuroLife system, which will incorporate accelerometers and other sensors to enable the control algorithms to monitor the position of the arm and detect fatigue.