An attentional model for video classification

Grant number: 18/10027-7
Support Opportunities: Scholarships in Brazil - Master
Effective date (Start): January 01, 2019
Effective date (End): September 30, 2020
Field of knowledge: Physical Sciences and Mathematics - Computer Science - Computer Systems
Principal Investigator: Gerberth Adín Ramírez Rivera
Grantee: Darley Freire Barreto
Host Institution: Instituto de Computação (IC). Universidade Estadual de Campinas (UNICAMP). Campinas, SP, Brazil


An important property of human perception is that we do not process a whole scene in its entirety at once; for example, to tell the model of a car, we do not need to look at the people or animals around it. Likewise, when classifying images, we need to “pay attention” to certain parts of the image. Attention is an important component of neural networks that adapts them to understand sequences such as video, speech, and text. In recent years, researchers have used attention to build state-of-the-art models for tasks such as image classification, image captioning, image generation, object recognition, action recognition, image segmentation, video captioning, machine translation, and visual question answering. One major challenge is that most attention-based models focus on natural language or related tasks, while video classification remains comparatively underexplored. Thus, in this Master's project, we propose to evaluate state-of-the-art models and to create a novel attention-based model for video classification. (AU)
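The abstract does not specify the architecture studied, but the core idea it describes — letting a classifier weight the informative parts of a sequence, such as the frames of a video — can be illustrated with a generic scaled dot-product attention pooling sketch. The function names, the fixed `query` vector, and the toy feature dimensions below are illustrative assumptions, not the project's actual model:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(frames, query):
    """Scaled dot-product attention over per-frame features.

    frames: (T, d) array, one feature vector per video frame.
    query:  (d,) vector (in a trained model this would be learned).
    Returns the attention-weighted summary (d,) and the weights (T,).
    """
    d = frames.shape[1]
    scores = frames @ query / np.sqrt(d)   # (T,) relevance of each frame
    weights = softmax(scores)              # (T,) non-negative, sums to 1
    summary = weights @ frames             # (d,) weighted average of frames
    return summary, weights

# Toy example: a 5-frame clip with 4-dimensional frame features.
rng = np.random.default_rng(0)
frames = rng.normal(size=(5, 4))
query = rng.normal(size=4)
summary, weights = attention_pool(frames, query)
```

The resulting `summary` vector would then feed a classifier head; because the weights sum to one, the model can concentrate on the few frames that matter for the label while downweighting the rest.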

News published in Agência FAPESP Newsletter about the scholarship:
Articles published in other media outlets (0 total):

Academic Publications
(References retrieved automatically from State of São Paulo Research Institutions)
BARRETO, Darley Freire. Um modelo de atenção visual hierárquico [A hierarchical visual attention model]. 2021. Master's Dissertation - Universidade Estadual de Campinas (UNICAMP), Instituto de Computação, Campinas, SP.
