Current studies in Human-Computer Interaction have drawn attention to the importance of emotional factors in interaction with computer systems. It is believed that if intelligent agents are employed to determine the nature of a user's emotions, they can induce and arouse those emotions in a way that acts as a stimulus during the interaction. For example, if the computer detects that the user is tired, stressed, or depressed, it can attempt to "persuade" the user to overcome this emotional state through a song, a video, and/or a game. A major challenge for researchers in human-computer interaction is to provide systems capable of recognizing, interpreting, and reacting intelligently and sensitively to the user's emotions, and thus meeting the requirements of the largest possible number of individuals. One way this project pursues this goal is to develop flexible systems that can handle a large number of emotions and types of behavior. The main purpose of this emotional interaction is to improve the coherence, consistency, and credibility of the reactions and computational responses that occur during interaction through a human-computer interface. In this context, there is an opportunity to explore computational systems able to discover or infer the user's emotional state at runtime; for example, if the software detects a "sad" emotional state in an elderly individual, it may suggest a song, a movie, or an activity that can change it to a "happy" state, and thus detect and/or prevent emotional disorders. This project aims to develop and evaluate a model that can: i) determine the emotional state of elderly users; ii) explore multiple sensors for emotion classification; and iii) relate the responses of multiple sensors to determine the user's emotion.
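The third aim, relating the responses of multiple sensors to determine the user's emotion, can be illustrated with a minimal late-fusion sketch: each sensor-specific classifier produces a probability distribution over emotion labels, and the distributions are combined by a weighted average. The sensor names, weights, and probability values below are illustrative assumptions, not part of the project's actual design.

```python
def fuse_emotions(sensor_probs, weights):
    """Combine per-sensor emotion probability dicts into a single ranking.

    sensor_probs: {sensor_name: {emotion: probability}}
    weights:      {sensor_name: relative confidence in that sensor}
    Returns the top emotion label and the fused distribution.
    """
    fused = {}
    total_w = sum(weights[s] for s in sensor_probs)
    for sensor, probs in sensor_probs.items():
        w = weights[sensor] / total_w  # normalize so fused probs sum to 1
        for emotion, p in probs.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * p
    return max(fused, key=fused.get), fused

# Hypothetical outputs of three sensor-specific classifiers
sensor_probs = {
    "camera":     {"happy": 0.2, "sad": 0.7, "neutral": 0.1},
    "microphone": {"happy": 0.1, "sad": 0.6, "neutral": 0.3},
    "heart_rate": {"happy": 0.3, "sad": 0.4, "neutral": 0.3},
}
weights = {"camera": 0.5, "microphone": 0.3, "heart_rate": 0.2}

label, scores = fuse_emotions(sensor_probs, weights)
# label → "sad": all three sensors lean toward that emotion
```

Weighted late fusion is only one of several plausible combination strategies; majority voting or a learned meta-classifier over the per-sensor outputs would fit the same interface.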