ROBERTO NAVARRO DE MESQUITA

Summary

Holds a bachelor's degree in Physics from the Universidade Estadual de Campinas (1987), a master's degree in Physics from the Universidade Estadual de Campinas (1991), and a doctorate in Mechanical Engineering from the Universidade de São Paulo (2002). He is currently a technologist (tecnologista pleno) at the Comissão Nacional de Energia Nuclear. He has experience in Computer Science, with an emphasis on Artificial Intelligence Systems, working mainly on the following topics: artificial intelligence, diagnosis of defects in tubes, eddy current testing (ECT), and pattern recognition in images. (Text extracted from the Lattes Curriculum on 27 Dec. 2021)

Search Results

  • Article IPEN-doc 27565
    Nutritional evaluation of Brachiaria brizantha cv. marandu using convolutional neural networks
    2020 - DAL PRÁ, BRUNO R.; MESQUITA, ROBERTO N. de; MENEZES, MARIO O. de; ANDRADE, DELVONEI A. de
    The identification of plant nutritional stress from visual symptoms is still predominantly manual, performed by trained specialists. The process tends to be time consuming, varies between crop areas, and often has to be repeated at several points of the property. This work proposes an image recognition system that analyzes the nutritional status of the plant to help address these problems. The methodology uses deep learning to automate the identification and classification of nutritional stress in Brachiaria brizantha cv. marandu. The system assesses the nutritional status of the plant from digital images of its leaves, identifying and classifying Nitrogen and Potassium deficiencies. Upon receiving an image of a pasture leaf, the system runs a classification with a convolutional neural network (CNN) and presents the diagnosed nutritional status. Tests on leaf images reached an accuracy of 96%. We are working to expand the image database so that the CNN can be trained on a larger amount of data, aiming at higher accuracy and more expressive results.
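
The publication above does not include code, but the described approach (a CNN classifying leaf images into nutritional-status classes) can be illustrated with a minimal sketch. The example below assumes TensorFlow/Keras, 128x128 RGB leaf images, three illustrative classes (healthy, nitrogen-deficient, potassium-deficient), and a directory named leaf_images/ organized by class; the architecture, image size, and dataset layout are hypothetical and are not the authors' actual model.

```python
# Hypothetical sketch of a small CNN leaf-image classifier (TensorFlow/Keras).
# Assumptions (not from the paper): 128x128 RGB inputs, 3 classes,
# images stored as leaf_images/<class_name>/*.jpg
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # assumed: healthy, nitrogen-deficient, potassium-deficient

def build_model(input_shape=(128, 128, 3), num_classes=NUM_CLASSES):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),              # normalize pixel values to [0, 1]
        layers.Conv2D(32, 3, activation="relu"),  # low-level texture/edge features
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                      # regularization for a small dataset
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",   # integer labels from the directory loader
        metrics=["accuracy"],
    )
    return model

if __name__ == "__main__":
    # Load labeled images from class subdirectories (hypothetical path).
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "leaf_images", image_size=(128, 128), batch_size=32
    )
    model = build_model()
    model.fit(train_ds, epochs=10)
```

In practice, a classifier of this kind would also use a held-out validation split and data augmentation, and accuracy figures such as the 96% reported above would depend heavily on the size and balance of the image database, which is why the authors note they are expanding it.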