Sparse Ensemble Machine Learning to Improve Robustness of Long-term Decoding in iBMIs

Title:
Sparse Ensemble Machine Learning to Improve Robustness of Long-term Decoding in iBMIs
Other Titles:
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Publication Date:
27 December 2019
Citation:
S. Shaikh, R. So, T. Sibindi, C. Libedinsky and A. Basu, "Sparse Ensemble Machine Learning to Improve Robustness of Long-term Decoding in iBMIs," in IEEE Transactions on Neural Systems and Rehabilitation Engineering. doi: 10.1109/TNSRE.2019.2962708
Abstract:
This paper presents a novel sparse-ensemble machine learning approach to enhance the robustness of intracortical Brain Machine Interfaces (iBMIs) against the non-stationary distribution of input neural data across time. Each classifier in the ensemble is trained on a randomly sampled (with replacement) subset of input channels. These sparse connections ensure that, with high probability, some of the base classifiers are less affected by variations in any given recording channel. We tested the generality of this technique on different base classifiers: linear discriminant analysis (LDA), support vector machine (SVM), extreme learning machine (ELM) and multilayer perceptron (MLP). Results show decoding accuracy improvements of up to ≈ 21%, 13%, 19% and 10% in non-human primate (NHP) A and 7%, 9%, 7% and 9% in NHP B across test days when using the sparse ensemble approach over a single-classifier model for the LDA, SVM, ELM and MLP algorithms respectively. Furthermore, improvements of up to ≈ 7(14)%, 8(15)%, 9(19)% and 7(15)% in NHP A and 8(15)%, 12(20)%, 15(23)% and 12(19)% in NHP B over Random Forest (Long Short-Term Memory) were obtained by sparse ensemble LDA, SVM, ELM and MLP respectively.
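The core idea in the abstract — train each base classifier on a randomly sampled (with replacement) subset of input channels, then combine by voting — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, the parameter names (n_estimators, channels_per_model) and the majority-vote rule are assumptions for demonstration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Toy stand-in for binned firing rates: 300 trials x 40 channels, 4 classes.
# Channel k carries extra signal for class k, so a few channels are informative.
X = rng.normal(size=(300, 40))
y = rng.integers(0, 4, size=300)
X[np.arange(300), y] += 3.0

# Sparse ensemble: each base LDA sees a with-replacement sample of channels.
n_estimators, channels_per_model = 25, 12
models = []
for _ in range(n_estimators):
    chans = rng.choice(X.shape[1], size=channels_per_model, replace=True)
    clf = LinearDiscriminantAnalysis().fit(X[:, chans], y)
    models.append((chans, clf))

# Majority vote across the base classifiers.
votes = np.stack([clf.predict(X[:, chans]) for chans, clf in models])
y_hat = np.array([np.bincount(col, minlength=4).argmax() for col in votes.T])
print("training accuracy:", (y_hat == y).mean())
```

Because each base model connects to only a fraction of the channels, a drift or dropout in any one channel degrades only the subset of models that sampled it, and the vote of the remaining models can compensate — which is the robustness argument the abstract makes.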
License type:
PublisherCopyrights
Description:
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
1534-4320 (print)
1558-0210 (online)