#DigitAg Master 2 (M2) scholarship (2022)

Characterizing peach tree health status with close-range RGB imaging: application to chlorophyll content and shot hole leaf symptoms

Developing agro-ecological practices in fruit tree production implies reducing pesticide use; however, to date only a few cultivars bearing disease and pest resistances are available on the market. Quantifying resistances or tolerances among breeding material or genetic resources in the orchard is challenging because damage is typically assessed visually on an ordinal scale and therefore lacks resolution and precision. In addition, few tools allow integrative traits (tree vigor, photosynthetic activity) to be measured quickly and reliably. Relying on the multidisciplinary expertise of a research institute, INRAE, and a private company, HIPHEN, this master's thesis will explore whether data acquired by close-range sensing can characterize tree health components and whether these methods can be more efficient than the usual ones. To this aim, one specific trait, shot hole symptoms (caused by Coryneum beijerinckii), and one integrative trait, chlorophyll content, have been chosen. These traits will be measured with standard methods (visually and with a chlorophyll meter, respectively) and with close-range imaging in two peach orchards managed under low phytosanitary protection. Image acquisition will be done with an RGB camera mounted on a PHENOMAN pole. The student will take part in data acquisition and pre-processing to build a reference dataset, and will then estimate relevant phenotypic variables from these images. For shot hole, deep learning algorithms will be trained on manually annotated pictures. For chlorophyll content, machine learning methods will link chlorophyll-meter measurements with one or more variables derived from the images. After a critical analysis of the results, the methods developed in this project could be made available to a broader audience through acquisition protocols and analysis pipelines.
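
To illustrate the deep-learning step for shot hole, here is a minimal sketch of fine-tuning a pretrained image classifier on manually annotated leaf pictures. The folder name, the two-class layout ('healthy' / 'shot_hole'), the ResNet-18 backbone and all hyperparameters are illustrative assumptions, not the project's actual pipeline.

```python
"""Sketch only: fine-tune a pretrained CNN on annotated leaf images.
Assumes a hypothetical folder 'annotated_leaves/' with one sub-folder per class."""
import torch
from torch import nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("annotated_leaves", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

# Replace the final layer of a pretrained ResNet-18 to separate
# symptomatic from healthy leaves (illustrative model choice).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```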
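
For the chlorophyll-content part, the announcement only states that machine learning will link chlorophyll-meter readings to image-derived variables. The sketch below shows one possible form of that link, a cross-validated regression; the CSV file, the colour-index column names and the random-forest model are hypothetical placeholders.

```python
"""Sketch only: regress chlorophyll-meter values on image-derived colour indices.
The reference dataset and its column names are assumptions for illustration."""
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# One row per measured leaf: colour indices computed from the RGB images
# plus the matching chlorophyll-meter value.
data = pd.read_csv("reference_dataset.csv")
features = data[["mean_green", "green_red_ratio", "dark_green_colour_index"]]
target = data["chlorophyll_meter_value"]

# Any regressor could be swapped in (linear model, gradient boosting, ...).
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, features, target, cv=5, scoring="r2")
print(f"Cross-validated R2: {scores.mean():.2f} +/- {scores.std():.2f}")
```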

Modified: 21 June 2023 | Created: 22 June 2022 | Written by: SLP