- Authors
- Valentin Forch, Technische Universität Chemnitz
- Dr. habil. Julien Vitay, Technische Universität Chemnitz
- Prof. Dr. Fred H. Hamker, Technische Universität Chemnitz
- Title
- Recurrent Spatial Attention for Facial Emotion Recognition
- Citable URL:
- https://nbn-resolving.org/urn:nbn:de:bsz:ch1-qucosa2-724532
- Conference
- LocalizeIT Workshop, Chemnitzer Linux-Tage, Chemnitz, March 16-17, 2019
- Source
- Chemnitzer Informatik-Berichte
  - Editor: Kowerko, Danny
  - Place of publication: Chemnitz
  - Year of publication: 2020
  - Series title: Chemnitzer Informatik-Berichte
  - Pages: 1-8
  - ISSN: 0947-5125
- First publication
- 2020
- Abstract (EN)
- Automatic processing of emotion information by deep neural networks (DNNs) can offer great benefits, e.g., for human-machine interaction. Conversely, machine learning can profit from concepts known from human information processing, such as visual attention. We employed a recurrent DNN incorporating a spatial attention mechanism for facial emotion recognition (FER) and compared the network's output with results from human experiments. The attention mechanism enabled the network to select relevant face regions and achieve state-of-the-art performance on an FER database containing images from realistic settings. When the model's perceptive capabilities were restricted, a visual search strategy emerged that showed some similarities to human saccading behavior; however, the model then failed to form a useful scene representation.
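The abstract describes a recurrent network whose hidden state selects relevant face regions via spatial attention. As a rough illustration (this is not the authors' implementation; all dimensions, weights, and the simple tanh recurrence are illustrative assumptions), a soft spatial attention step over a CNN feature map, driven by a recurrent state, can be sketched in NumPy like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not taken from the paper):
H, W, C = 8, 8, 16   # spatial feature map of a face image
D = 32               # recurrent hidden state size

feat = rng.standard_normal((H, W, C))       # CNN feature map
h = np.zeros(D)                             # recurrent state
W_att = rng.standard_normal((D, C)) * 0.1   # hidden state -> attention query
W_in = rng.standard_normal((C, D)) * 0.1    # glimpse -> hidden update
W_rec = rng.standard_normal((D, D)) * 0.1   # recurrent weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

attention_maps = []
for t in range(3):  # a few attention steps ("glimpses")
    query = h @ W_att                        # (C,) query from current state
    scores = feat.reshape(-1, C) @ query     # score per spatial location
    alpha = softmax(scores)                  # attention weights over locations
    glimpse = alpha @ feat.reshape(-1, C)    # attended feature vector (C,)
    h = np.tanh(glimpse @ W_in + h @ W_rec)  # simple recurrent update
    attention_maps.append(alpha.reshape(H, W))
```

At each step the attention weights form a probability distribution over spatial locations, so the network "looks at" a weighted subset of face regions before updating its state; the paper's model uses an LSTM (see the keywords) rather than this plain recurrence.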
- Free keywords (EN)
- emotion recognition, attention, LSTM
- Classification (DDC)
- 004.152
- Subject headings (GND)
- Deep Learning
- Publishing institution
- Technische Universität Chemnitz, Chemnitz
- Version / review status
- published version / publisher's version
- URN Qucosa
- urn:nbn:de:bsz:ch1-qucosa2-724532
- Qucosa publication date
- October 15, 2020
- Document type
- Conference paper
- Document language
- English
- License / rights notice
- CC BY-SA 4.0