The Multimodal Interaction Group (MMI), part of the Computer Vision and Multimedia Laboratory (CVML), studies the use of various modalities for human-computer interaction and affective computing. The modalities currently studied are visual (eye-gaze tracking), auditory (3D sonification), haptic, and physiological (EEG and peripheral physiological signals). Our three main research and development topics, concerning interaction through physiological signals and eInclusion, are:
1. brain-computer interaction (BCI) based on EEG, and brain source reconstruction (the forward and inverse problems);
2. affective computing and emotion assessment;
3. electronic aids for visually impaired people and for the elderly.
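To illustrate the source reconstruction topic above, here is a minimal sketch of the linear EEG forward model and a minimum-norm (Tikhonov-regularized) inverse solution. The leadfield matrix, source configuration, and regularization value are toy assumptions for illustration only; in practice the leadfield comes from solving the forward problem on a real head model (e.g. with the finite element method).

```python
import numpy as np

# Toy forward model: x = A s + n, where the leadfield A maps
# source amplitudes s to measurements x at the electrodes.
# All sizes and values below are illustrative assumptions.
rng = np.random.default_rng(0)
n_electrodes, n_sources = 32, 200

A = rng.standard_normal((n_electrodes, n_sources))   # hypothetical leadfield
s_true = np.zeros(n_sources)
s_true[[10, 120]] = [1.0, -0.5]                      # two active sources
x = A @ s_true + 0.01 * rng.standard_normal(n_electrodes)

# Minimum-norm estimate (regularized inverse):
#   s_hat = A^T (A A^T + lambda I)^{-1} x
lam = 1e-2
s_hat = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(n_electrodes), x)

# The inverse problem is underdetermined (200 sources, 32 electrodes),
# so s_hat is the smallest-norm source field consistent with the data.
residual = np.linalg.norm(A @ s_hat - x) / np.linalg.norm(x)
```

The small regularization term `lam` trades data fit against the norm of the estimated sources, which stabilizes the ill-posed inversion.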
Our specialized equipment includes acquisition systems for EEG (Biosemi, GTec, Emotiv), peripheral physiological signals (EMG, blood and breathing rates, EDR, skin temperature), eye-gaze trackers (Tobii), stereo and 3D+ cameras, etc.
A comparative survey of methods for remote heart rate detection from frontal face videos.
The impact of denoising on independent component analysis of functional magnetic resonance imaging data.
Computational aspects of the EEG forward problem solution for real head model using finite element method.
EEG-based synchronized brain-computer interfaces: a model for optimizing the number of mental tasks.
Faculté des sciences
Université de Genève