Brain-Computer Engagement

    • #79830
      Louis Scapillato
      Participant

      Title

      Towards an Affective Brain-Computer Interface Monitoring Musical Engagement

      Abstract

      A non-invasive way to monitor a music listener’s level of engagement could provide a valuable tool for music classification, technology, and therapy. To investigate whether musical engagement can be monitored, we developed an experimental protocol using the mobile brain/body imaging (MoBI) paradigm, in which participants make expressive rhythmic arm gestures to encourage and/or index musical engagement. Participants communicate the felt pulse of the music they are hearing via simple rhythmic U-shaped back-and-forth movements shown on a video display in front of them. Participants are asked to imagine that this display is also being viewed remotely by a deaf friend to whom they are attempting to communicate the feeling of the music they are hearing. In an engaged condition, listeners are encouraged to immerse themselves fully in this musical/emotional communication task. In a not-engaged condition, a concurrent internal arithmetic distractor task is introduced to induce less fully engaged listening. Here, we report results of training a classifier using a frequency-based common spatial patterns (FBCSP) approach to distinguish the engaged and not-engaged conditions from concurrently recorded EEG data. The approach gave 67% classification accuracy across subjects (versus 50% chance) and 85% accuracy within subjects, cross-validated using a block-wise paradigm.
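      For readers unfamiliar with the FBCSP approach the abstract mentions, a minimal sketch of the idea is below: band-pass the multichannel EEG into several frequency bands, learn common spatial pattern (CSP) filters per band that maximize variance for one class relative to the other, extract log-variance features, and feed them to a classifier. Everything here is illustrative, not the authors' implementation: the band edges, sampling rate, synthetic two-condition data, FFT-mask filter, and nearest-class-mean classifier are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 128.0                    # sampling rate (Hz); illustrative, not from the paper
BANDS = [(8, 12), (18, 26)]   # example filter bank (alpha, beta); assumed bands

def bandpass(x, lo, hi, fs=FS):
    """Crude FFT-mask band-pass (a real pipeline would use FIR/IIR filters)."""
    freqs = np.fft.rfftfreq(x.shape[-1], d=1.0 / fs)
    spec = np.fft.rfft(x, axis=-1)
    spec[..., (freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=x.shape[-1], axis=-1)

def mean_cov(trials):
    """Trace-normalized average spatial covariance over (trials, chans, samples)."""
    covs = [tr @ tr.T / np.trace(tr @ tr.T) for tr in trials]
    return np.mean(covs, axis=0)

def csp_filters(cov_a, cov_b, n_pairs=1):
    """CSP via whitening + eigendecomposition; rows of the result are spatial filters."""
    d, u = np.linalg.eigh(cov_a + cov_b)
    white = (u / np.sqrt(d)) @ u.T             # symmetric whitening of the composite cov
    s, v = np.linalg.eigh(white @ cov_a @ white)
    w = v.T @ white                            # filters sorted by eigenvalue (ascending)
    return np.vstack([w[:n_pairs], w[-n_pairs:]])  # keep the most discriminative pair(s)

def features(trials, bank):
    """Normalized log-variance features, concatenated across bands."""
    feats = []
    for (lo, hi), w in bank:
        z = np.einsum("fc,tcs->tfs", w, bandpass(trials, lo, hi))
        var = z.var(axis=-1)
        feats.append(np.log(var / var.sum(axis=1, keepdims=True)))
    return np.concatenate(feats, axis=1)

def make_trials(n, active_chan, n_chan=4, n_samp=256):
    """Synthetic 'EEG': unit noise plus a 10 Hz rhythm on one channel."""
    t = np.arange(n_samp) / FS
    x = rng.normal(0.0, 1.0, (n, n_chan, n_samp))
    phase = rng.uniform(0, 2 * np.pi, (n, 1))
    x[:, active_chan, :] += 3.0 * np.sin(2 * np.pi * 10 * t + phase)
    return x

# Two synthetic "conditions" differing in which channel carries the rhythm.
a, b = make_trials(40, 0), make_trials(40, 1)
tr_a, te_a, tr_b, te_b = a[:30], a[30:], b[:30], b[30:]

# Fit one set of CSP filters per band on the training covariances.
bank = [((lo, hi), csp_filters(mean_cov(bandpass(tr_a, lo, hi)),
                               mean_cov(bandpass(tr_b, lo, hi))))
        for lo, hi in BANDS]

# Nearest-class-mean classifier in feature space (a stand-in, not the paper's classifier).
mu_a, mu_b = features(tr_a, bank).mean(0), features(tr_b, bank).mean(0)

def predict(x):
    f = features(x, bank)
    return (np.linalg.norm(f - mu_a, axis=1) > np.linalg.norm(f - mu_b, axis=1)).astype(int)

acc = np.mean(np.concatenate([predict(te_a) == 0, predict(te_b) == 1]))
print(f"held-out accuracy: {acc:.2f}")
```

      On this toy data the two conditions are trivially separable, so held-out accuracy is near perfect; the point is only the pipeline shape (filter bank → per-band CSP → log-variance features → classifier), not the numbers reported in the abstract.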

      Authors

      Grace Leslie, Alejandro Ojeda, Scott Makeig

      Citation

      Leslie, G., Ojeda, A., & Makeig, S. (2013). Towards an affective brain-computer interface monitoring musical engagement. 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, 871–875.
