Exploring visual analytics for supporting the user in active learning

Grant number: 22/12668-5
Support Opportunities: Scholarships abroad - Research Internship - Doctorate
Effective date (Start): January 01, 2023
Effective date (End): December 31, 2023
Field of knowledge: Physical Sciences and Mathematics - Computer Science - Computing Methodologies and Techniques
Principal Investigator: Alexandre Xavier Falcão
Grantee: Bárbara Caroline Benato
Supervisor: Alexandru-Cristian Telea
Host Institution: Instituto de Computação (IC). Universidade Estadual de Campinas (UNICAMP). Campinas, SP, Brazil
Research place: Utrecht University (UU), Netherlands
Associated to the scholarship: 19/10705-8 - Visual Active Learning guided by Feature Projections, BP.DR


Deep neural networks can be very effective for image classification. However, they usually rely on large supervised training sets to avoid overfitting, i.e., high classification accuracy on the training data combined with low accuracy on unseen test sets. Large supervised (labeled) training sets are usually impractical in applications that require human expertise for data supervision. Traditional Active Learning (AL) strategies are commonly used to increase the number of supervised samples with the assistance of the apprentice classifier: at each iteration, the AL process indicates the most informative samples for user annotation, and the classifier is retrained on the enlarged supervised set. Deep AL approaches have been considered since they allow the feature space and the classifier to be learned jointly. However, they still rely on questionable assumptions: large labeled training sets, a softmax layer to provide confidence information, and fine-tuning to avoid retraining the model from scratch. To circumvent these issues, we propose two distinct learning steps: (i) label propagation from a few supervised samples followed by confidence-based sample selection, and (ii) feature and classifier learning with the selected labeled samples to improve the feature space of the deep model. For that, we have studied solutions in the main project (#2019/10705-8) that iteratively combine deep feature learning and feature space projections to label large unsupervised datasets, and we have investigated the best automatic technique for each step of the proposed methodology. We are next interested in adding Active Learning procedures, through Visual Analytics tools, to increase the number of supervised samples and/or improve confidence when selecting labeled samples to train the deep model.
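The two learning steps described above can be sketched in a few lines. The following is a minimal, hypothetical illustration, not the project's actual pipeline: it uses scikit-learn's LabelSpreading over a t-SNE projection of a toy dataset, and the projection technique, dataset, number of seed labels, and confidence threshold are all placeholder assumptions.

```python
# Hypothetical sketch of step (i): propagate labels from a few supervised
# samples through a 2D projection, then keep only confident pseudo-labels.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.semi_supervised import LabelSpreading

X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(0)

# Project the features (here, raw pixels stand in for deep features) to 2D.
X_2d = TSNE(n_components=2, random_state=0).fit_transform(X)

# Pretend only a handful of samples are supervised; -1 marks "unlabeled".
y_train = np.full_like(y, -1)
seed = rng.choice(len(y), size=50, replace=False)
y_train[seed] = y[seed]

# Propagate labels through the neighborhood structure of the 2D projection.
model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X_2d, y_train)

# Confidence-based selection: keep pseudo-labels with a high posterior;
# these would feed step (ii), retraining the deep feature extractor.
conf = model.label_distributions_.max(axis=1)
confident = conf > 0.9
print(f"{confident.sum()} of {len(y)} samples pseudo-labeled confidently")
```

In the actual methodology, the confidently pseudo-labeled set would then be used to relearn the deep model's feature space, and the projection would be recomputed for the next iteration.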
As earlier proposed by the candidate, the user can visualize supervised (small set) and unsupervised (large set) training samples in a 2D projection space and propagate labels to the unsupervised ones, with or without the assistance of a pattern classifier, but with no further object supervision. Briefly, the goal of this BEPE project is to design a Visual Analytics workflow for interactive and incremental machine learning focused on AL. Validation studies will be conducted with image datasets from distinct applications related to the thematic project (#2014/12236-1), coordinated by the advisor. This BEPE project will also allow a joint PhD degree for the candidate at Utrecht University, under the supervision of Prof. Alexandru Telea (the co-advisor). (AU)
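A single AL iteration of the kind described, where the system indicates samples for user annotation, is often approximated by uncertainty sampling. The sketch below uses entropy of a classifier's predicted probabilities to rank candidates; the classifier, dataset, seed-set size, and batch size are illustrative assumptions, not the project's actual choices.

```python
# Hypothetical sketch of one AL iteration: rank unlabeled samples by
# prediction entropy and surface the top-k most uncertain to the user.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(0)
labeled = rng.choice(len(y), size=50, replace=False)
unlabeled = np.setdiff1d(np.arange(len(y)), labeled)

# Train the apprentice classifier on the current supervised set.
clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])

# Entropy-based uncertainty: higher entropy means a less confident
# prediction, hence a more informative sample to annotate.
proba = clf.predict_proba(X[unlabeled])
entropy = -(proba * np.log(proba + 1e-12)).sum(axis=1)
query = unlabeled[np.argsort(entropy)[-10:]]  # top-10 most uncertain
print("Samples suggested for user annotation:", query)
```

In the envisioned Visual Analytics workflow, such suggested samples would be highlighted on the 2D projection so the user can annotate them in context rather than in isolation.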


Scientific publications
(References retrieved automatically from Web of Science and SciELO through information on FAPESP grants and their corresponding numbers as mentioned in the publications by the authors)
BENATO, BARBARA C.; FALCAO, ALEXANDRE X.; TELEA, ALEXANDRU C. Measuring the quality of projections of high-dimensional labeled data. COMPUTERS & GRAPHICS-UK, v. 116, p. 11-pg. (22/12668-5, 14/12236-1, 19/10705-8)
BENATO, BARBARA C.; TELEA, ALEXANDRU C.; FALCAO, ALEXANDRE X. Deep feature annotation by iterative meta-pseudo-labeling on 2D projections. PATTERN RECOGNITION, v. 141, p. 16-pg. (22/12668-5)
