VIVER - visão para aprender ("vision to learn") - Multimodal platform using Artificial Intelligence for people with language, communication, and learning disorders and visual impairments.

Abstract

This project proposes an incremental innovation: enhancing the interface of a platform originally developed for the therapeutic stimulation of visual abilities with an AI (Artificial Intelligence) model based on computer vision, convolutional neural networks, and machine learning, in order to offer multimodal games. By multimodality we mean the integrated stimulation of visual, motor, auditory, and cognitive skills within a single resource, meeting the needs of different age groups. The project aims to benefit individuals with language disorders, communication difficulties, learning disabilities, and visual impairments.

By capturing digital interactions in the games through eye tracking, gesture recognition, and voice capture, the AI will provide data to support, guide, evaluate, and direct intervention for individuals with visual impairments, reading and writing disorders, acquired neurological conditions, autism spectrum disorder, attention deficit hyperactivity disorder, and speech impairments, in both face-to-face and remote therapeutic and educational settings. The platform will also provide the first library of pictograms for augmentative and alternative communication (AAC) designed specifically for visually impaired individuals, as well as a reward screen integrated into the games.

Regarding commercialization, supported by robust marketing practices and business planning, the multimodal games will be offered through monthly or annual subscriptions under a freemium model for healthcare professionals, educators, educational institutions, and rehabilitation centers, targeting the Ibero-Latin American market two years after project implementation. The domestic serviceable obtainable market (SOM) in the healthcare sector comprises an estimated 27,000 professionals, indicating a potential niche for sales of multimodal stimulation resources. As for the target beneficiaries, the IBGE (2022) reports 18.9 million people with disabilities in Brazil, 7.2 million of whom have two or more impaired functions.

During the execution of this project, we plan to sell 2,050 subscriptions (20,500 beneficiaries) to the visual skills platform; at the end of 24 months, the project foresees migrating these subscriptions to the multimodal AI games, with a 30% annual increment. By reaching 7,200 subscription sales, it will be possible to benefit 72,000 people (1% of the Brazilian population with disabilities affecting one or more functions). (AU)
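
The abstract describes capturing gaze, gesture, and voice interactions during the games so that the AI can produce data to support and evaluate intervention. As a purely illustrative aid, the sketch below shows one simple way raw gaze samples could be reduced to basic attention metrics (fixation count and mean fixation duration) with a dispersion-threshold heuristic; the data format, thresholds, and function names are assumptions made for this example and are not part of the VIVER platform.

```python
# Hypothetical sketch: summarising raw gaze samples from a game session into
# simple attention metrics. All names, thresholds, and the data format are
# illustrative assumptions, not the VIVER platform's actual implementation.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    t_ms: float   # timestamp in milliseconds
    x: float      # horizontal gaze position, normalised to 0..1
    y: float      # vertical gaze position, normalised to 0..1

def fixations(samples: List[GazeSample],
              max_dispersion: float = 0.05,
              min_duration_ms: float = 100.0) -> List[Tuple[float, float]]:
    """Group consecutive samples into fixations using a simple
    dispersion threshold on normalised coordinates."""
    result: List[Tuple[float, float]] = []
    window: List[GazeSample] = []
    for s in samples:
        window.append(s)
        xs = [p.x for p in window]
        ys = [p.y for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # Dispersion exceeded: close the current window if it lasted long enough.
            window.pop()
            if window and window[-1].t_ms - window[0].t_ms >= min_duration_ms:
                result.append((window[0].t_ms, window[-1].t_ms))
            window = [s]
    if window and window[-1].t_ms - window[0].t_ms >= min_duration_ms:
        result.append((window[0].t_ms, window[-1].t_ms))
    return result

if __name__ == "__main__":
    # Fake session: steady gaze on one target, then a jump to another target.
    session = [GazeSample(t, 0.50 + 0.001 * i, 0.50)
               for i, t in enumerate(range(0, 300, 20))]
    session += [GazeSample(t, 0.80, 0.20) for t in range(300, 600, 20)]
    fixes = fixations(session)
    durations = [end - start for start, end in fixes]
    if durations:
        print(f"fixations: {len(fixes)}, "
              f"mean duration: {sum(durations) / len(durations):.0f} ms")
```

Metrics of this kind (per session, per game) are one plausible form the interaction data mentioned in the abstract could take when presented to therapists and educators.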
