Domain Adaptation and Transfer Learning methods enhance Deep Learning Models used in Inner Speech Based Brain Computer Interfaces

  • Luciano Ivan Zablocki Facultad de Ingeniería y Ciencias Hídricas
  • Agustín Nicolás Mendoza Facultad de Ingeniería y Ciencias Hídricas - Univ. Nac. Litoral
  • Nicolás Nieto Instituto de Matemática Aplicada del Litoral - Inst. de Investigación en Señales, Sistemas e Inteligencia Computacional - Univ. Nac. Litoral-CONICET
Keywords: Deep Learning, Domain Adaptation, Transfer Learning, Convolutional Neural Network

Abstract

Brain-computer interfaces are useful devices that can partially restore communication for severely compromised patients. Although advances in deep learning have significantly improved brain pattern recognition, large amounts of data are required to train these deep architectures. In recent years, the inner speech paradigm has drawn much attention, as it could potentially allow natural control of different devices. However, as of the date of this publication, only a small amount of data is available for this paradigm. In this work we show that, by means of transfer learning and domain adaptation methods, it is possible to make the most of the scarce data, enhancing the training process of a deep learning architecture used in brain-computer interfaces.
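The abstract does not detail the authors' architecture or adaptation procedure. As a generic, illustrative sketch only (not the paper's implementation), transfer learning under data scarcity can be framed as pretraining a model on plentiful source-domain data (e.g., recordings from other subjects) and then fine-tuning it on the few available target-domain samples. The data, model, and hyperparameters below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.1, steps=200):
    # Minimal logistic-regression training loop (batch gradient descent).
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)    # gradient of log-loss
    return w

# "Source" domain: plentiful labeled data (stand-in for other subjects' EEG).
Xs = rng.normal(size=(500, 8))
ys = (Xs[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(float)

# "Target" domain: scarce labeled data (stand-in for the new subject's
# inner-speech recordings), with a related but slightly shifted boundary.
Xt = rng.normal(size=(20, 8))
yt = (0.9 * Xt[:, 0] + 0.1 * Xt[:, 1] + 0.3 * rng.normal(size=20) > 0).astype(float)

w = train(np.zeros(8), Xs, ys)                 # 1) pretrain on source data
w = train(w, Xt, yt, lr=0.05, steps=50)        # 2) fine-tune on scarce target data
preds = (1.0 / (1.0 + np.exp(-Xt @ w)) > 0.5).astype(float)
```

The key design point is the second `train` call: instead of starting from scratch on 20 samples, the fine-tuning step starts from weights already shaped by the related source domain, which is the sense in which scarce target data can be "made the most of".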

Published
2022-12-14
How to cite
Zablocki, L., Mendoza, A., & Nieto, N. (2022). Domain Adaptation and Transfer Learning methods enhance Deep Learning Models used in Inner Speech Based Brain Computer Interfaces. Memorias De Las JAIIO, 8(2), 54-60. Retrieved from https://ojs.sadio.org.ar/index.php/JAIIO/article/view/263
Section
ASAI - Simposio Argentino de Inteligencia Artificial