Visual Input Affects the Decoding of Imagined Movements of the Same Limb

Research output: Contribution to conference › Poster › Research › peer-review

Abstract

A better understanding of how movements are encoded in electroencephalography (EEG) signals is required to develop more natural control of motor neuroprostheses. We decoded imagined hand close and supination movements from seven healthy subjects and investigated the influence of visual input. We found that motor imagination of these movements can be decoded from low-frequency time-domain EEG signals with a maximum average classification accuracy of 57.3 ± 5.0%. The simultaneous observation of congruent hand movements increased the classification accuracy to 64.1 ± 8.3%. Furthermore, the sole observation of hand movements yielded discriminable brain patterns (61.9 ± 5.5%). These findings show that, for low-frequency time-domain EEG signals, the type of visual input during classifier training affects performance and has to be considered in future studies.
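
To make the decoding step described above concrete, the following is a minimal sketch of how two imagined movements could be classified from low-frequency time-domain EEG features. The 0.3-3 Hz band, the shrinkage-regularized LDA classifier, the sampling rate, the epoch length, and the simulated data are illustrative assumptions only; the poster does not specify these details, and this is not the authors' pipeline.

```python
# Minimal sketch: binary classification of two imagined movements from
# low-frequency time-domain EEG. Band limits, classifier, and data sizes
# are assumptions for illustration, not details taken from the poster.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 256  # sampling rate in Hz (assumed, not stated in the abstract)

def low_freq_features(epochs, fs, band=(0.3, 3.0), n_samples=20):
    """Band-pass filter epochs (trials x channels x time) into a low-frequency
    band and keep a downsampled time-domain window as the feature vector."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    idx = np.linspace(0, epochs.shape[-1] - 1, n_samples).astype(int)
    return filtered[..., idx].reshape(len(epochs), -1)

# Simulated stand-in data: 60 trials per class, 64 channels, 2-second epochs.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((120, 64, 2 * fs))
y = np.repeat([0, 1], 60)  # hypothetical labels: 0 = hand close, 1 = supination

X = low_freq_features(X_raw, fs)
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")  # ~0.5 on noise
```

On real recordings, the same cross-validation would be run separately for each visual-input condition (motor imagination alone, imagination with congruent movement observation, and observation alone) to obtain the condition-wise accuracies compared in the abstract.
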
Original language: English
Publication status: Published - 18 Sep 2017
Event: 7th Graz BCI Conference 2017: From Vision to Reality - Graz, Austria
Duration: 18 Sep 2017 - 22 Sep 2017

Conference

Conference: 7th Graz BCI Conference 2017
Country: Austria
City: Graz
Period: 18/09/17 - 22/09/17

Fingerprint

  • Electroencephalography
  • Decoding
  • Brain
  • Classifiers

Fields of Expertise

  • Human- & Biotechnology

Treatment code (Nähere Zuordnung)

  • Basic - Fundamental (Grundlagenforschung)

Cite this

Ofner, P., Kersch, P., & Müller-Putz, G. (2017). Visual Input Affects the Decoding of Imagined Movements of the Same Limb. Poster session presented at 7th Graz BCI Conference 2017, Graz, Austria.

@conference{d6f4d0601d9c4fbda67b68e883030e26,
title = "Visual Input Affects the Decoding of Imagined Movements of the Same Limb",
abstract = "A better understanding how movements are encoded in electroencephalography (EEG) signals is required to develop a more natural control for motor neuroprostheses. We decoded imagined hand close and supination movements from seven healthy subjects and investigated the influence of the visual input. We found that motor imagination of these movements can be decoded from low-frequency time-domain EEG signals with a maximum average classification accuracy of 57.3 +/- 5.0{\%}. The simultaneous observation of congruent hand movements increased the classification accuracy to 64.1 +/- 8.3{\%}. Furthermore, the sole observation of hand movements yielded discriminable brain patterns (61.9 +/- 5.5{\%}). These findings show that for low-frequency time-domain EEG signals, the type of visual input during classifier training affects the performance and has to be considered in future studies.",
author = "Patrick Ofner and Philipp Kersch and Gernot M{\"u}ller-Putz",
year = "2017",
month = "9",
day = "18",
language = "English",
note = "7th Graz BCI Conference 2017 : From Vision to Reality ; Conference date: 18-09-2017 Through 22-09-2017",

}
