Towards brain-activity-controlled information retrieval: Decoding image relevance from MEG signals.

Jukka-Pekka Kauppi, Melih Kandemir, Veli-Matti Saarinen, Lotta Hirvenkari, Lauri Parkkonen, Arto Klami, Riitta Hari, Samuel Kaski

Research output: Contribution to journal › Journal article › peer-review

Abstract

We hypothesize that brain activity can be used to control future information retrieval systems. To this end, we conducted a feasibility study on predicting the relevance of visual objects from brain activity. We analyze both magnetoencephalographic (MEG) and gaze signals from nine subjects who were viewing image collages, a subset of which was relevant to a predetermined task. We report three findings: i) the relevance of an image a subject looks at can be decoded from MEG signals with performance significantly better than chance, ii) fusion of gaze-based and MEG-based classifiers significantly improves the prediction performance compared to using either signal alone, and iii) non-linear classification of the MEG signals using Gaussian process classifiers outperforms linear classification. These findings break new ground for building brain-activity-based interactive image retrieval systems, as well as for systems utilizing feedback both from brain activity and eye movements.
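
To make finding (iii) concrete, the sketch below contrasts a non-linear Gaussian process classifier with a linear baseline, as the abstract describes for the MEG decoding task. It is an illustrative assumption only: the synthetic features, labels, and scikit-learn estimators stand in for the authors' actual MEG preprocessing and classification pipeline, which is not specified in the abstract.

```python
# Minimal sketch, not the authors' pipeline: non-linear GP classification vs.
# a linear baseline on synthetic stand-in features (200 "epochs" x 20 features).
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic features and binary "relevant / non-relevant" labels (assumed data).
X = rng.normal(size=(200, 20))
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200) > 0.5).astype(int)

linear_clf = LogisticRegression(max_iter=1000)          # linear baseline
gp_clf = GaussianProcessClassifier(kernel=1.0 * RBF())  # non-linear GP with an RBF kernel

for name, clf in [("linear", linear_clf), ("GP (RBF)", gp_clf)]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {auc.mean():.2f}")
```

Finding (ii), fusion of gaze-based and MEG-based classifiers, could analogously be sketched as late fusion, for example by averaging the two classifiers' predicted relevance probabilities; the abstract does not state the exact fusion rule used in the study.
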
Original language: English
Journal: NeuroImage
Volume: 112
Pages (from-to): 288-298
ISSN: 1053-8119
DOIs
Publication status: Published - 2015
Externally published: Yes
