Machine learning techniques, especially those based on deep learning, have been applied successfully across a wide range of domains. However, these techniques expose several hyperparameters that must be tuned individually for each dataset, and this tuning is essential for good performance. This research project introduces a Genetic Programming (GP) approach for the fine-tuning of the hyperparameters of recurrent neural networks. More specifically, the word representation, number of layers, number of hidden units, and word batch size of LSTM (Long Short-Term Memory) networks will be optimized, and the results will be validated on text datasets. The task under study is the recognition of the grammatical class of words, known as Part-of-Speech (POS) tagging. For comparison, widely known public corpora such as the Brown corpus will be used. The project also includes a research internship abroad through the FAPESP/BEPE program.
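The kind of evolutionary hyperparameter search described above can be sketched as a simple generational loop. This is a minimal illustrative sketch, not the project's actual method: the search-space values, the genetic-algorithm-style operators, and especially the `fitness` function (which here returns a synthetic score rather than training an LSTM tagger on the Brown corpus) are all assumptions made to keep the example self-contained.

```python
import random

# Illustrative search space for the four hyperparameters named in the
# abstract (the actual value ranges used in the project are assumptions).
SEARCH_SPACE = {
    "embedding": ["word2vec", "glove", "fasttext"],
    "num_layers": [1, 2, 3],
    "hidden_units": [64, 128, 256, 512],
    "batch_size": [16, 32, 64],
}


def random_individual():
    """Sample one candidate hyperparameter configuration."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}


def fitness(config):
    """Placeholder for validation accuracy.

    In the real project this would train an LSTM POS tagger with
    `config` and evaluate it on held-out data; a synthetic score is
    used here so the sketch runs without any training.
    """
    return (
        config["hidden_units"] / 512
        + config["num_layers"] / 3
        - config["batch_size"] / 128
    )


def mutate(config, rate=0.3):
    """Resample each gene independently with probability `rate`."""
    return {
        k: random.choice(SEARCH_SPACE[k]) if random.random() < rate else v
        for k, v in config.items()
    }


def crossover(a, b):
    """Uniform crossover: each gene is taken from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}


def evolve(generations=20, pop_size=10, elite=2):
    """Generational loop with elitism: keep the best, breed the rest."""
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:elite]
        children = [
            mutate(crossover(*random.sample(pop[:5], 2)))
            for _ in range(pop_size - elite)
        ]
        pop = survivors + children
    return max(pop, key=fitness)


best = evolve()
```

In a full implementation, each fitness evaluation would be a complete training run, so the population size and number of generations would be chosen to balance search quality against compute cost.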
News about the scholarship published in the Agência FAPESP newsletter: