This project focuses on the emotional analysis of music and on how such techniques can enable better music recommendations. You will likely work with an existing dataset (e.g. the PMEmo dataset), although you may collect your own if that aligns better with your research interests. You will investigate how deep neural networks (DNNs) can be used for music emotion recognition, and explore methods for fusing multimodal music data. Contact: Abdallah El Ali (aea cwi).

Emotion recognition has moved away from the desktop and onto the road, whether in automated or non-automated vehicles.
This requires collecting precise ground-truth labels in such settings without causing driver distraction, whether the primary task is driving or, in the case of automated driving, situation monitoring. This project asks: How can drivers continuously annotate how they are feeling while driving? How can we ensure that providing such annotations does not distract them from their primary task? The project will require prototyping emotion input techniques on the steering wheel, and evaluating them in a desktop-based driving-simulator study to ensure both high usability of the wheel concept and high quality of the collected annotations.
Emotion recognition has moved away from the desktop and into virtual environments. This requires collecting ground-truth labels in such settings. This project asks: How can we continuously annotate how we are feeling while immersed in a mixed or fully virtual environment?