Bethge, David (2023): Machine learning systems for human emotional states. Dissertation, LMU München: Fakultät für Mathematik, Informatik und Statistik
PDF: Bethge_David.pdf (50MB)
Abstract
Human emotions play a vital role in our everyday lives and influence our communication, perception, and experience in the real world. However, detecting, modeling, and predicting human emotional states remains challenging: emotions are subjective, ambiguous, and volatile, and current sensing modalities lack robustness in realistic environments and are potentially intrusive. The central question of this thesis is: how can we design emotion-aware systems under these constraints that utilize recent advances in machine learning? We present novel input modalities for human emotion recognition based on multimodal and contextual data. We furthermore propose machine learning frameworks that provide more explanatory, generalizable, privacy-preserving, and multi-sensory predictions of emotional states.

We introduce two ubiquitous mobile sensing applications for human emotional states [P1, P6]. The applications are designed to be minimally intrusive and do not require body contact, as they acquire only contextual data to infer emotions. We show that re-formulating the classification problem as indirect context-emotion pattern recognition works better than existing facial-expression baseline approaches. Our findings have major implications for future emotional state recognition systems, because ubiquitous devices can easily acquire contextual data and thereby become smart emotion-estimation sensors.

In the second pillar of this thesis, we focus on machine learning architectures for human emotional states. We present three deep learning architectures for learning emotion representations from multivariate time-series data [P4, P3, P2]. We show how the representations can be designed to be emotion-predictive yet participant-invariant [P2]. We furthermore propose an architecture that can learn a shared emotional representation from multiple noisy datasets with different inherent sensing characteristics [P3, P4]. In summary, the proposed approaches are tailored to capture detailed semantics in noisy input data and include measures for making the predictions more interpretable, domain-invariant, and scalable. Furthermore, we present a novel transformer-based deep learning architecture for emotion prediction that derives explanations alongside its predictions [P7]. This yields a better understanding of the link between the sensor input and the model decision, which ultimately helps in designing appropriate user interfaces powered by emotion prediction methods.

Finally, we present a machine learning application of contextual emotion prediction that enables an emotion-aware experience in a dynamic driving environment [P5]. We present the first navigation application that can route based on emotions. We leverage the concept of context-emotion linking to predict emotional states on future road segments and to optimize for travel time and predicted emotional states. In a blinded user study, we show that emotional navigation has a positive effect on valence after riding the predicted happy route.

Although our work concentrates on human emotional states, we argue that it enables prediction systems for other human states (e.g., fatigue, cognitive load) to be designed to be more robust, scalable, and privacy-aware, while allowing users to understand and investigate the decision process through offered explanations. Furthermore, our work provides a foundation for emotion-aware systems in unconstrained, dynamic environments.
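One way to read the "emotion-predictive yet participant-invariant" representation idea [P2] is as an adversarial training setup. The following is a minimal illustrative sketch, not the thesis code: it assumes a PyTorch 1D-convolutional encoder over multivariate time-series windows, a gradient-reversal layer, and hypothetical layer sizes and loss weighting (`lambda_adv`).

```python
# Illustrative sketch (not the thesis implementation): an encoder trained to
# predict emotion labels while a gradient-reversed adversary tries to predict
# participant identity, pushing participant information out of the embedding.
# All module names, sizes, and the weighting `lambda_adv` are assumptions.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass, negated (scaled) gradient on backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class EmotionEncoder(nn.Module):
    def __init__(self, in_channels=8, hidden=64, n_emotions=3, n_participants=20):
        super().__init__()
        # 1D conv encoder over multivariate time-series (batch, channels, time)
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
        )
        self.emotion_head = nn.Linear(hidden, n_emotions)          # emotion-predictive
        self.participant_head = nn.Linear(hidden, n_participants)  # adversary

    def forward(self, x, lambda_adv=1.0):
        z = self.encoder(x)
        emotion_logits = self.emotion_head(z)
        # Gradient reversal encourages the encoder to *remove* participant information
        participant_logits = self.participant_head(GradReverse.apply(z, lambda_adv))
        return emotion_logits, participant_logits


# Toy usage: one training step on random data
model = EmotionEncoder()
x = torch.randn(16, 8, 128)                 # 16 windows, 8 sensor channels, 128 samples
y_emotion = torch.randint(0, 3, (16,))
y_participant = torch.randint(0, 20, (16,))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

opt.zero_grad()
emo_logits, part_logits = model(x)
loss = ce(emo_logits, y_emotion) + ce(part_logits, y_participant)
loss.backward()
opt.step()
```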
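The emotion-aware navigation concept [P5], predicting emotional states on future road segments and trading them off against travel time, can be illustrated with a small route-scoring sketch. All names (`Segment`, `route_cost`, `predict_valence`, `alpha`) and the 0–1 valence scale are assumptions for illustration, not the thesis implementation.

```python
# Illustrative sketch (not the thesis implementation): score candidate routes by
# travel time inflated according to the predicted emotional valence on the
# route's segments, then pick the lowest-cost route.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Segment:
    segment_id: str
    travel_time_s: float  # estimated travel time for this road segment
    context: dict         # contextual features (e.g., traffic, road type, scenery)


def route_cost(route: List[Segment],
               predict_valence: Callable[[dict], float],
               alpha: float = 0.5) -> float:
    """Lower is better; valence is assumed to lie in [0, 1]."""
    total_time = sum(s.travel_time_s for s in route)
    mean_valence = sum(predict_valence(s.context) for s in route) / len(route)
    # Inflate travel time when the predicted emotional experience is poor;
    # alpha controls how strongly predicted valence influences the choice.
    return total_time * (1.0 + alpha * (1.0 - mean_valence))


def pick_route(candidates: List[List[Segment]],
               predict_valence: Callable[[dict], float],
               alpha: float = 0.5) -> List[Segment]:
    return min(candidates, key=lambda r: route_cost(r, predict_valence, alpha))


# Toy usage with a hypothetical context-based valence predictor
def dummy_valence(context: dict) -> float:
    return 0.8 if context.get("scenic") else 0.4


fast = [Segment("a", 300, {"scenic": False}), Segment("b", 300, {"scenic": False})]
scenic = [Segment("c", 360, {"scenic": True}), Segment("d", 360, {"scenic": True})]
best = pick_route([fast, scenic], dummy_valence, alpha=1.0)  # picks the scenic route
```

With `alpha = 0` the choice reduces to the fastest route; larger values of `alpha` let the predicted emotional experience dominate the decision.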
| Document type: | Dissertations (Dissertation, LMU München) |
|---|---|
| Keywords: | Machine Learning, Ubiquitous Computing, Automotive, Emotions, Affective Computing, Representation Learning, HCI |
| Subject areas: | 000 Generalities, Computer Science, Information Science; 000 Generalities, Computer Science, Information Science > 004 Computer Science |
| Faculties: | Fakultät für Mathematik, Informatik und Statistik |
| Language of the thesis: | English |
| Date of oral examination: | 26 May 2023 |
| 1st referee: | Schmidt, Albrecht |
| MD5 checksum of the PDF file: | 1ba6548f7c7a127478de7e46012e256f |
| Shelf mark of the printed copy: | 0001/UMC 29789 |
| ID Code: | 32270 |
| Deposited on: | 18 Aug. 2023 13:50 |
| Last modified: | 22 Aug. 2023 11:14 |