Core Project 4 aims to convert the collected data into robotic actions in the field, exploiting digital avatars. Precision robotic weeding, for example, seeks to intervene in a minimally invasive way, reducing inputs such as herbicides. The project develops autonomous aerial and ground field robots that detect and identify individual plants, producing a weed map of the field so that each plant receives the most appropriate treatment. The robots also apply nitrogen fertilizer precisely, guided by digital avatars that predict plant nutrient demand and probable losses in the field.

Research Videos

Image-based Plant Phenotyping

PhenoRob PhD Student Jan Weyler talks about his research within Core Project 4: Autonomous In-Field Intervention.

Precision Weed Management Enabled by Robotics and Robotic Vision

PhenoRob PhD Student Alireza Ahmadi talks about his research within Core Project 4: Autonomous In-Field Intervention.

Topic Introduction: UAV Remote Sensing for Improving Crop Models

PhenoRob PhD Student Jordan Bates talks about his research within Core Project 4: Autonomous In-Field Intervention.

AgroC Model Development and Parameterization to Characterize Plant-soil System

PhenoRob PhD Student Rajina Bajracharya talks about her research within Core Project 4: Autonomous In-Field Intervention.

Developing New Algorithms for Autonomous Decision Making to Optimize Robotic Farming

PhenoRob Junior Research Group Leader Marija Popovic talks about her research within Core Project 4: Autonomous In-Field Intervention.

Developing Weed Management Strategies for Autonomous Field Robots

PhenoRob PhD Student Marie Zingsheim talks about her research within Core Project 4: Autonomous In-Field Intervention.

Virtual Temporal Samples for RNNs: applied to semantic segmentation in agriculture

Normally, training a recurrent neural network (RNN) requires labeled samples from a video (temporal) sequence, which is laborious to annotate and has stymied work in this direction. By generating virtual temporal samples, we demonstrate that it is possible to train a lightweight RNN to perform semantic segmentation on two challenging agricultural datasets. Full text on arXiv: https://arxiv.org/abs/2106.10118. Check my GitHub for interesting ROS-based projects: https://github.com/alirezaahmadi
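The idea of virtual temporal samples can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's actual augmentation: a single labeled frame is cropped at slightly shifted positions to simulate camera motion, yielding a temporally consistent pseudo-sequence whose labels come for free from the one annotated frame.

```python
import numpy as np

def virtual_temporal_sequence(image, label, length=4, max_shift=8, seed=0):
    """Generate a pseudo video sequence from one labeled frame.

    Illustrative sketch (not the paper's exact scheme): we simulate robot
    motion by sliding a fixed-size crop window across the frame, so a
    single ground-truth label yields `length` temporally coherent
    image/label pairs for RNN training.
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    # shrink the crop so every shifted window stays inside the frame
    crop_h, crop_w = h - max_shift * length, w - max_shift * length
    frames, labels = [], []
    y, x = 0, 0
    for _ in range(length):
        frames.append(image[y:y + crop_h, x:x + crop_w].copy())
        labels.append(label[y:y + crop_h, x:x + crop_w].copy())
        # random step simulating forward/lateral robot motion between frames
        y += int(rng.integers(0, max_shift + 1))
        x += int(rng.integers(0, max_shift + 1))
    return np.stack(frames), np.stack(labels)
```

The resulting arrays have shape `(length, crop_h, crop_w, ...)` and can be fed to a recurrent segmentation model as if they were a real annotated video clip.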

Talk by R. Sheikh on Gradient and Log-based Active Learning for Semantic Segmentation... (ICRA'20)

ICRA 2020 talk about the paper: R. Sheikh, A. Milioto, P. Lottes, C. Stachniss, M. Bennewitz, and T. Schultz, “Gradient and Log-based Active Learning for Semantic Segmentation of Crop and Weed for Agricultural Robots,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA), 2020. PDF: https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/sheikh2020icra.pdf

IROS'20: Domain Transfer for Semantic Segmentation of LiDAR Data using DNNs presented by J. Behley

F. Langer, A. Milioto, A. Haag, J. Behley, and C. Stachniss, “Domain Transfer for Semantic Segmentation of LiDAR Data using Deep Neural Networks,” in Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2020. Paper: https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/langer2020iros.pdf

ICRA'2020: Visual Servoing-based Navigation for Monitoring Row-Crop Fields

Visual Servoing-based Navigation for Monitoring Row-Crop Fields by A. Ahmadi et al. In Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2020.

ICRA'19: Robot Localization Based on Aerial Images for Precision Agriculture by Chebrolu et al.

Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields by N. Chebrolu, P. Lottes, T. Laebe, and C. Stachniss. In Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2019. Paper: http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/chebrolu2019icra.pdf

Towards Autonomous Visual Navigation in Arable Fields

Our paper can be found on arXiv: [Towards Autonomous Crop-Agnostic Visual Navigation in Arable Fields](https://arxiv.org/abs/2109.11936). You can find the implementation at: [visual-multi-crop-row-navigation](https://github.com/Agricultural-Robotics-Bonn/visual-multi-crop-row-navigation). More details about our project BonnBot-I and PhenoRob at: https://www.phenorob.de/ and http://agrobotics.uni-bonn.de/

Adaptive-Resolution Field Mapping Using Gaussian Process Fusion With Integral Kernels by L.Jin et al

This short paper trailer is based on the following publication: L. Jin, J. Rückin, S. H. Kiss, T. Vidal-Calleja, and M. Popović, “Adaptive-Resolution Field Mapping Using Gaussian Process Fusion With Integral Kernels,” IEEE Robotics and Automation Letters, vol. 7, pp. 7471-7478, 2022. doi:10.1109/LRA.2022.3183797
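For intuition, here is a minimal Gaussian-process field-mapping sketch. It uses a plain RBF kernel on a fixed query grid rather than the paper's integral kernels and adaptive resolution, so it only illustrates the underlying idea of fusing sparse field measurements into a probabilistic map; `gp_field_map` and its parameters are illustrative names, not the authors' API.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between two sets of 2D field locations."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_field_map(X_obs, y_obs, X_grid, noise=0.1, lengthscale=1.0):
    """Posterior mean and variance of a scalar field on query locations.

    X_obs: (n, 2) measurement positions, y_obs: (n,) measured values
    (e.g. a vegetation index), X_grid: (m, 2) map cells to predict.
    Standard GP regression via a Cholesky factorization.
    """
    K = rbf_kernel(X_obs, X_obs, lengthscale) + noise ** 2 * np.eye(len(X_obs))
    Ks = rbf_kernel(X_grid, X_obs, lengthscale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = rbf_kernel(X_grid, X_grid, lengthscale).diagonal() - (v ** 2).sum(0)
    return mean, var
```

The posterior variance is what makes such maps useful for planning: cells far from any measurement keep high variance, signaling where the robot should sample next.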

IROS 2022 - Towards Autonomous Visual Navigation in Arable Fields

Presented at IROS 2022, Kyoto, Japan.
Alireza Ahmadi, Michael Halstead, Chris McCool
Agricultural Robotics and Engineering, University of Bonn

Fully autonomous vision-only row-crop field traversal scheme:
Proposes a novel multi-crop-row detection and recognition method, tested in real fields with cluttered, weedy scenes.
Autonomously switches between lanes of crops using only RGB cameras fixed to the front and back of the robot.
Average navigation deviation from the GPS ground truth of 3.82 cm, or approximately 10% of the crop-row distance, across the five real crop types.

Group website: http://agrobotics.uni-bonn.de/publications/
GitHub implementations:
First version: https://github.com/PRBonn/visual-crop-row-navigation
Second version: https://github.com/Agricultural-Robotics-Bonn/visual-multi-crop-row-navigation

@article{ahmadi2021towards,
  title={Towards Autonomous Crop-Agnostic Visual Navigation in Arable Fields},
  author={Ahmadi, Alireza and Halstead, Michael and McCool, Chris},
  journal={arXiv preprint arXiv:2109.11936},
  year={2021}
}

@inproceedings{ahmadi2020visual,
  title={Visual servoing-based navigation for monitoring row-crop fields},
  author={Ahmadi, Alireza and Nardi, Lorenzo and Chebrolu, Nived and Stachniss, Cyrill},
  booktitle={2020 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={4920--4926},
  year={2020},
  organization={IEEE}
}
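The row-following idea behind such a navigation scheme can be sketched in a few lines. This is an illustrative toy, not BonnBot-I's actual pipeline: fit a line to detected plant centroids in image space, then steer to reduce the robot's lateral deviation from that line. The function names and parameterization are assumptions for the sketch.

```python
import numpy as np

def fit_crop_row(centroids):
    """Fit a 2D line x = a*y + b to plant centroids via least squares.

    centroids: (n, 2) array of (x, y) image positions of detected plants.
    Parameterizing x as a function of y suits rows that run away from
    the camera toward the horizon.
    """
    xs, ys = centroids[:, 0], centroids[:, 1]
    A = np.stack([ys, np.ones_like(ys)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, xs, rcond=None)
    return a, b

def lateral_deviation(a, b, robot_x, robot_y):
    """Signed horizontal offset of the robot from the fitted row at robot_y.

    A controller would steer to drive this value toward zero, keeping
    the robot centered between crop rows.
    """
    return robot_x - (a * robot_y + b)
```

In a full system this repeats per frame and per row, with the reported 3.82 cm average deviation corresponding to how well the closed loop holds this offset near zero.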