REAL-TIME RECOGNITION OF AFFECTIVE RESPONSES TO CLIMATE-CHANGE IMAGES: A DEEP-LEARNING MODEL
A collaboration between the CREx and LPL engineer Gilles Pouchoulin (IR)
Abstract
An emotion-recognition model is applied to the images of the French Affective Images of Climate Change (FAICC) database (Otavi, Roussel & Syssau, 2021) via real-time EEG recording. Multi-channel EEG is recorded while participants view images from the FAICC. In real time, brain maps of the frequency-band features of the recorded EEG are fed to a convolutional neural network (CNN). The CNN was trained and tested on EEG data from the DEAP dataset (Koelstra et al., 2012); specifically, the model was trained on EEG features related to arousal and valence. The ultimate aim of this work is to measure, in an objective manner, the response to climate-change images, with a view to assessing their effectiveness in motivating people to act positively.
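To make the pipeline concrete, the following is a minimal sketch, not the authors' implementation, of the kind of processing the abstract describes: per-channel band-power features are computed from a short EEG window, projected onto a coarse 2-D "brain map", and passed to a small CNN with two continuous outputs (valence, arousal). The electrode grid, frequency-band limits, network shape, and all names (band_power_maps, ValenceArousalCNN, GRID) are illustrative assumptions; only the DEAP post-processing sampling rate of 128 Hz comes from the published dataset description.

```python
# Hedged sketch of the EEG -> brain-map -> CNN pipeline (assumptions noted above).
import numpy as np
from scipy.signal import welch
import torch
import torch.nn as nn

FS = 128                      # DEAP EEG sampling rate after preprocessing (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
GRID = (9, 9)                 # coarse scalp grid, one cell per electrode (assumption)

def band_power_maps(eeg, channel_xy):
    """eeg: (n_channels, n_samples); channel_xy: (x, y) grid cell per channel.
    Returns a (n_bands, 9, 9) tensor of band powers placed on the scalp grid."""
    maps = np.zeros((len(BANDS), *GRID), dtype=np.float32)
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)          # PSD per channel
    for b, (lo, hi) in enumerate(BANDS.values()):
        idx = (freqs >= lo) & (freqs < hi)
        power = psd[:, idx].mean(axis=1)                    # mean power in band
        for ch, (x, y) in enumerate(channel_xy):
            maps[b, y, x] = power[ch]
    return torch.from_numpy(maps)

class ValenceArousalCNN(nn.Module):
    """Small CNN producing continuous valence and arousal estimates."""
    def __init__(self, n_bands=len(BANDS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)          # outputs: [valence, arousal]

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Usage example: 32 channels, a 2-second window, placeholder electrode layout.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((32, 2 * FS)).astype(np.float32)
    channel_xy = [(i % 9, i // 9) for i in range(32)]       # placeholder layout
    x = band_power_maps(eeg, channel_xy).unsqueeze(0)       # add batch dimension
    valence, arousal = ValenceArousalCNN()(x)[0]
    print(float(valence), float(arousal))
```

In a real-time setting, the same window-by-window feature extraction would be run on the live EEG stream and the (pre-trained) network applied to each window, but buffering, artifact handling, and electrode placement are deliberately omitted here.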
Origin: Files produced by the author(s)