To enhance our scientific understanding of the spatio-temporal dynamics that underlie plant–environment interactions, such as nutrient imbalance or the spread of plant diseases, we need to monitor individual plant organs and surface elements in time and space. A key objective of this core project is therefore the registration of image and range data of individual plants measured at different points in time. We use different autonomous sensor platforms, i.e., ground and aerial vehicles, to acquire phenotypic data at different spatial resolutions and coverages, ranging from the single organ, over the experimental plot, to the field scale. These systems have to acquire measurements autonomously; they must be precise enough to resolve single plants in a field, yet fast enough to allow the mapping of larger fields and plot experiments.
With these high-resolution measurements, we aim to reconstruct detailed structural representations of the plants, capturing their geometric shape and their appearance. These structural 3D representations will be registered non-rigidly over time to enable the tracking of single plant organs and surface elements, leading to a 4D plant representation. In addition, data from multiple sensors have to be merged to exploit the full information content of the measurable spectral bands and the various sensor readings. The crop data will be analyzed using modern machine learning approaches and radiative transfer inversion schemes to compute a new generation of maps of plant traits, such as photosynthetic capacity, early stress parameters, canopy light absorption, or early signs of nutrient imbalance and pest infestation.
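To make the registration step concrete, the following sketch shows the rigid building block that non-rigid, temporal registration pipelines typically start from: the Kabsch algorithm, which estimates a rotation and translation aligning two 3D point clouds with known point correspondences. The function name, the simulated "plant" point cloud, and the assumption of known correspondences and a purely rigid motion are ours for illustration; the actual pipeline would use a non-rigid method on real sensor data.

```python
import numpy as np

def kabsch_register(source, target):
    """Estimate the rigid transform (R, t) that maps the source point
    cloud onto the target, given known point correspondences."""
    src_c = source.mean(axis=0)          # centroids of both clouds
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Simulated point cloud of a plant measured at time t0, and the same
# cloud at t1 after a rigid motion (a stand-in for the real, non-rigid case).
rng = np.random.default_rng(0)
cloud_t0 = rng.random((50, 3))
angle = np.deg2rad(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.05])
cloud_t1 = cloud_t0 @ R_true.T + t_true

R_est, t_est = kabsch_register(cloud_t0, cloud_t1)
residual = np.linalg.norm(cloud_t0 @ R_est.T + t_est - cloud_t1)
```

In practice, correspondences between time points are unknown and plant organs grow and deform, so this rigid solver would serve only as an inner step, e.g., inside an ICP-style loop or a deformation-graph-based non-rigid registration.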
We expect new insights from combining the models built in this project with the relevance analyses and management decisions from Core Projects 2 and 4. This will also support the scientific understanding of the interactions between crops and their environment.