

Multimedia Affective Analysis Group

 

Our work focuses on the emotional understanding of multimedia content and automatic emotion recognition. Emotional understanding of multimedia content involves developing models that automatically predict the emotion a piece of content expresses; for example, the emotion expressed in a song can be estimated from its acoustic content. We are also interested in multimodal emotion recognition from facial expressions and physiological responses. Our team is also affiliated with the CVML Lab at the Computer Science department.

TEAM

Dr. Mohammad Soleymani
PhD, Senior researcher (Group Leader)

Dr. Anna Aljanaki
PhD, Postdoctoral researcher

Soheil Rayatdoost
PhD student in Computer Science

Stan Blachut
MSc student in Neuroscience


Manuel Ogi
Research assistant


PROJECTS

Automatic recognition of visual interest and interestingness

This project’s main aims are twofold: first, studying knowledge emotions in multimedia search and browsing; second, developing tools for the automatic recognition of knowledge emotions. We studied the underlying attributes that construct visual interest, e.g., novelty, coping potential, and quality. We then learn these sub-components both from the visual content and from users’ spontaneous reactions. Our analysis of image and GIF interestingness demonstrated the feasibility of predicting overall interestingness from visual content alone.
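As a hedged illustration of the second aim, the sketch below fits a simple ridge regression that maps hand-crafted visual attributes (synthetic stand-ins for novelty, coping potential, and quality) to an interestingness rating. All data, weights, and feature names here are illustrative assumptions, not the project's actual features or models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 images, each described by three visual
# attributes (stand-ins for novelty, coping potential, quality).
n_images, n_attrs = 200, 3
X = rng.uniform(0, 1, (n_images, n_attrs))
true_w = np.array([0.6, 0.3, 0.1])               # assumed attribute weights
y = X @ true_w + rng.normal(0, 0.05, n_images)   # noisy interestingness ratings

# Ridge regression in closed form: w = (X^T X + lam I)^-1 X^T y
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(n_attrs), X.T @ y)

pred = X @ w
mae = float(np.mean(np.abs(pred - y)))
print("learned weights:", np.round(w, 2))
print("mean absolute error:", round(mae, 3))
```

With enough rated images, the learned weights recover the assumed attribute contributions; in the real setting the attributes would themselves be predicted from pixels or from viewers' reactions.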

Funded by: Swiss National Science Foundation


Inter-modality interaction between EEG signals and facial expressions for emotion recognition

The goal of this project is to identify the spatio-temporal patterns of EEG artifacts caused by facial expressions. We record EEG signals and facial expressions from participants under different conditions. We then use signal processing and machine learning models to automatically learn to separate muscular from cerebral activities. The muscular activities recorded in EEG signals can then be discarded for EEG analysis and used separately for emotion recognition.
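A minimal sketch of the separation idea, assuming an ICA-style decomposition (a small FastICA written from scratch and run on synthetic two-channel data); the project's actual signals and models are far richer than this toy example.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
t = np.linspace(0, 10, n)

# Illustrative "sources": a slow oscillation standing in for cerebral
# activity and a bursty high-frequency signal standing in for muscular
# (EMG) artifacts. Real EEG is far messier than this toy setup.
cerebral = np.sin(2 * np.pi * 1.5 * t)
muscular = np.sign(np.sin(2 * np.pi * 40 * t)) * np.sin(2 * np.pi * 0.5 * t) ** 2
S = np.vstack([cerebral, muscular])

A = np.array([[1.0, 0.6], [0.4, 1.0]])   # assumed mixing at the electrodes
X = A @ S                                 # observed two-channel "EEG"

# Center and whiten the observations.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Xw = (E / np.sqrt(d)) @ E.T @ X

# FastICA with tanh nonlinearity and symmetric decorrelation.
W = rng.normal(size=(2, 2))
for _ in range(200):
    U = W @ Xw
    G = np.tanh(U)
    W = G @ Xw.T / n - np.diag((1 - G**2).mean(axis=1)) @ W
    d2, E2 = np.linalg.eigh(W @ W.T)
    W = (E2 / np.sqrt(d2)) @ E2.T @ W     # W <- (W W^T)^(-1/2) W

S_est = W @ Xw

# Correlate each recovered component with each true source (sign-invariant).
corr = np.abs(np.corrcoef(np.vstack([S, S_est]))[:2, 2:])
print("best |corr| per source:", np.round(corr.max(axis=1), 3))
```

Once the components are unmixed, the "muscular" one can be dropped from the EEG or kept as its own expression-related feature, which is the intuition behind the project.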

Funded by: Hasler Stiftung

 


 

Emotional analysis in music

Emotional expression is a central component of music and of everyday music listening, so automatic music emotion recognition is key to successful music recommendation and playlist generation. Like such broad concepts as genre or world music styles, musical emotions are influenced by every element of the audio. In this project we focus on the computational extraction of what we call "mid-level features", so named because they sit between the low-level timbres, chords, and beats and the high-level styles and emotions. These elements, such as melodiousness, rhythmic complexity, harmoniousness, or atonality, arise from musical structure, both vertical (harmonic structure) and horizontal (repetition). We are working on methods to extract these mid-level features from Western music.
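As a toy illustration of one mid-level feature, the sketch below scores rhythmic regularity as the strongest non-zero-lag peak of an onset-envelope autocorrelation. The envelope is synthetic (in practice it would come from audio, e.g., spectral flux), and this feature definition is an assumption for illustration, not the project's actual extractor.

```python
import numpy as np

def rhythmic_regularity(onsets):
    """Strongest non-zero-lag autocorrelation peak, normalized by lag 0."""
    x = onsets - onsets.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / ac[0]
    return float(ac[1:].max())

fps = 100                              # envelope frames per second
frames = 10 * fps

regular = np.zeros(frames)
regular[::50] = 1.0                    # steady pulse every 0.5 s (120 BPM)

rng = np.random.default_rng(2)
irregular = np.zeros(frames)
irregular[rng.choice(frames, size=20, replace=False)] = 1.0

reg_score = rhythmic_regularity(regular)
irr_score = rhythmic_regularity(irregular)
print("regular:", round(reg_score, 2), "irregular:", round(irr_score, 2))
```

A steady pulse yields a score near 1 while randomly placed onsets score much lower; a real system would aggregate several such descriptors before mapping them to emotion.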

Funded by: Swiss excellence scholarship awarded to A. Aljanaki


 

REPRESENTATIVE PUBLICATIONS

  1. A. Aljanaki, Y.-H. Yang, M. Soleymani. Developing a Benchmark for Emotional Analysis in Music, PLOS ONE, to appear, 2017.
  2. M. Soleymani, F. Villaro-Dixon, T. Pun, G. Chanel. Toolbox for Emotional fEAture extraction from Physiological signals (TEAP), Frontiers in ICT - Human-Media Interaction, 2017.
  3. M. Soleymani, S. Asghari-Esfeden, Y. Fu, M. Pantic. Analysis of EEG signals and facial expressions for continuous emotion detection, IEEE Transactions on Affective Computing, 7(1): pp. 17-28, 2016.
  4. M. Gygli, M. Soleymani. Analyzing and Predicting GIF Interestingness, ACM International Conference on Multimedia (MM), Amsterdam, the Netherlands, 2016.
  5. M. Soleymani. The Quest for Visual Interest, ACM International Conference on Multimedia (MM), Brisbane, Australia, 2015.
  6. M. Soleymani, A. Aljanaki, F. Wiering, R.C. Veltkamp. Content-based music recommendation using underlying music preference structure, IEEE International Conference on Multimedia and Expo (ICME), Torino, Italy, 2015.

RESEARCH RESOURCES

Toolbox for Emotion Analysis using Physiological signals (TEAP)

DEAM dataset - The MediaEval Database for Emotional Analysis of Music

GIF interestingness database (ACM MM 2016)

Visual Interest database - 1005 images from Flickr with interest and emotional ratings + workers' personality traits (ACM MM 2015 short paper)

Emotion in Music (1000 songs) dataset (MediaEval 2013)

MAHNOB-HCI implicit tagging database

(DEAP) A database for emotion analysis using EEG, physiological signals and video

Violent scenes in movies (MediaEval 2011 and 2012)
