Iterative Pseudo-Labeling with Deep Feature Annotation and Confidence-Based Sampling

Author(s):
Benato, Barbara C. ; Telea, Alexandra C. ; Falcao, Alexandre X. ; IEEE Comp Soc
Total Authors: 4
Document type: Conference paper
Source: 2021 34th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI 2021), 7 pp., 2021.
Abstract

Training deep neural networks is challenging when large annotated datasets are unavailable. Extensive manual annotation of data samples is time-consuming, expensive, and error-prone, notably when it must be done by experts. To address this issue, increased attention has been devoted to techniques that propagate uncertain labels (also called pseudo labels) to large amounts of unsupervised samples and use them to train the model. However, these techniques still need hundreds of supervised samples per class in the training set, plus a validation set with extra supervised samples to tune the model. We improve a recent iterative pseudo-labeling technique, Deep Feature Annotation (DeepFA), by selecting only the most confident unsupervised samples to iteratively train a deep neural network. Our confidence-based sampling strategy relies on only dozens of annotated training samples per class and no validation set, considerably reducing user effort in data annotation. We first ascertain the best configuration for the baseline (a self-trained deep neural network) and then evaluate our confidence DeepFA for different confidence thresholds. Experiments on six datasets show that DeepFA already outperforms the self-trained baseline, while confidence DeepFA can considerably outperform both the original DeepFA and the baseline. (AU)
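The confidence-based sampling idea summarized above can be illustrated with a minimal self-training loop. The sketch below is a simplification under stated assumptions: it uses scikit-learn's MLPClassifier as a stand-in for the deep network and a fixed probability threshold, and it does not reproduce DeepFA's deep-feature projection and label-propagation steps. The function name, threshold value, network size, and synthetic data are all hypothetical choices for illustration.

# Hedged sketch of confidence-thresholded self-training (not the authors' exact DeepFA pipeline).
import numpy as np
from sklearn.neural_network import MLPClassifier

def confidence_self_training(X_sup, y_sup, X_unsup, threshold=0.9, n_iters=5, seed=0):
    """Iteratively pseudo-label unsupervised samples whose predicted class
    probability exceeds `threshold` and retrain on the enlarged training set."""
    X_train, y_train = X_sup.copy(), y_sup.copy()
    pool = X_unsup.copy()                      # remaining unlabeled samples
    for _ in range(n_iters):
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                            random_state=seed).fit(X_train, y_train)
        if len(pool) == 0:
            break
        proba = clf.predict_proba(pool)
        conf = proba.max(axis=1)               # confidence of the predicted label
        keep = conf >= threshold               # confidence-based sampling
        if not keep.any():
            break
        pseudo_y = clf.classes_[proba[keep].argmax(axis=1)]  # map columns to labels
        X_train = np.vstack([X_train, pool[keep]])
        y_train = np.concatenate([y_train, pseudo_y])
        pool = pool[~keep]
    return clf

# Illustrative usage with synthetic data only:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_sup = rng.normal(size=(40, 8)); y_sup = rng.integers(0, 2, 40)
    X_unsup = rng.normal(size=(400, 8))
    model = confidence_self_training(X_sup, y_sup, X_unsup, threshold=0.9)

The key design point mirrored from the abstract is that only samples whose pseudo labels pass the confidence threshold are added back to the training set at each iteration, so no separate validation set is required to filter them.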

FAPESP's process: 19/10705-8 - Visual Active Learning guided by Feature Projections
Grantee: Bárbara Caroline Benato
Support Opportunities: Scholarships in Brazil - Doctorate
FAPESP's process: 14/12236-1 - AnImaLS: Annotation of Images in Large Scale: what can machines and specialists learn from interaction?
Grantee: Alexandre Xavier Falcão
Support Opportunities: Research Projects - Thematic Grants