Hyperparameter Optimization in Deep Learning

Grant number: 23/03479-7
Support Opportunities: Scholarships in Brazil - Scientific Initiation
Effective date (Start): May 01, 2023
Effective date (End): March 31, 2024
Field of knowledge: Physical Sciences and Mathematics - Computer Science - Computing Methodologies and Techniques
Principal Investigator: Antonio Rafael Sabino Parmezan
Grantee: João Pedro Ribeiro da Silva
Host Institution: Instituto de Ciências Matemáticas e de Computação (ICMC). Universidade de São Paulo (USP). São Carlos, SP, Brazil
Associated research grant:19/17721-9 - The role of Chemistry in holobiont adaptation, AP.TEM


Deep learning is a machine learning technique based on artificial neural networks that has been successfully employed in many applications, including image analysis, speech recognition, and text comprehension. A deep model is built from multiple layers of processing, many of which can run concurrently, in which high-level features are defined and automatically extracted from low-level ones. A major challenge in building deep learning models is determining the best setting for their hyperparameters, i.e., finding the combination of values that maximizes the model's performance during the training phase. In this project, we aim to investigate hyperparameter optimization methods for deep learning models, especially for fine-tuning pre-trained convolutional neural networks, which will be used to classify small and large sets of microorganism images. The efficiency of the investigated hyperparameter optimization methods will be evaluated in terms of accuracy, running time, and number of iterations, considering custom hardware.
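The hyperparameter search described above can be sketched as a simple random search over a candidate space. The search space, the objective, and all names below are illustrative assumptions, not details from the project: a real run would replace `evaluate` with fine-tuning the pre-trained network and measuring validation accuracy.

```python
import random

# Hypothetical search space for fine-tuning a pre-trained CNN
# (hyperparameter names and ranges are illustrative only).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [16, 32, 64],
    "frozen_layers": [0, 2, 4],
}


def evaluate(config):
    """Stand-in for the expensive step: in practice this would
    fine-tune the network with `config` and return validation
    accuracy. Here, a synthetic score peaks at an arbitrary point."""
    score = 1.0
    score -= abs(config["learning_rate"] - 3e-4) * 100
    score -= abs(config["batch_size"] - 32) / 100
    score -= abs(config["frozen_layers"] - 2) / 10
    return score


def random_search(n_iter=20, seed=0):
    """Sample n_iter random configurations and keep the best one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_iter):
        cfg = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

Random search is only a baseline; the methods under study (e.g., Bayesian optimization or bandit-based approaches) aim to reach a comparable configuration in fewer iterations, which is exactly the iteration count the project proposes to measure.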
