A Multi-Agent Framework for Automated Agriculture Applications

Kumar Ankit

Robert Bosch Center for Cyber Physical Systems

Basic Research Proposal

Small multi-rotor UAVs are now widely used for scientific data gathering, and their adoption has recently surged in niche domains such as agriculture. Agriculturalists are choosing UAV-based field operations and remote sensing over satellite-based alternatives, especially for local-scale, high-spatiotemporal-resolution imagery. Combining the aerial survey capabilities of Unmanned Aerial Vehicles with the targeted intervention abilities of agricultural Unmanned Ground Vehicles can significantly improve the effectiveness of robotic systems for automated agriculture. This research proposal presents a framework for such collaboration.

A)Multi-Agent Collaborative Framework for Agriculture: Farm IoT networks often end up as ad hoc arrangements that need to be organized into a systematic framework. At the other end of the spectrum, heavy UGVs compact the topsoil and reduce its productivity. A hybrid approach combining UAVs and lightweight UGVs could be employed, where UAVs direct the operations (in addition to performing standard tasks such as surveillance and spraying), while heavier work like tilling and sowing is done by the UGVs. Submodules of the framework are listed below.

a.Multi-vehicle coordination: Cooperation between aerial and ground robots offers clear benefits in many applications, thanks to the complementary characteristics of these robots. UAVs can hover and provide visual data to detect and locate events, whereas UGVs can be deployed with suitable equipment to carry out the resulting tasks.

b.Mission planner: In a centralized multi-agent setup, a mission planner is needed to demarcate and allocate tasks in an optimized fashion.

c.Decision making: To optimize the operation of the robots, decisions must be made when allocating tasks. These decisions are based on costs and rewards, which must be designed for each task and for the overall operation.

d.Task allocation and handling: This module takes care of proper allocation, execution and completion of the tasks assigned to the robots. In case of a fault, the module is responsible for handling it in an optimized and safe way.

e.Path planning: To optimize fleet performance, the fleet must be directed to the right places at the right times. It therefore needs an efficient routing protocol, which saves considerable time and energy for the overall system.

f.Load transportation: The ground robot can operate for long periods, carry high payloads, and perform targeted actions, such as fertilizer application or selective weed treatment, on areas selected by the UAV.
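As a sketch of how the decision-making and task-allocation submodules above might fit together, the following greedy allocator assigns each task to the cheapest capable agent, using travel distance as a stand-in cost. The agent names, task types and cost model are illustrative assumptions, not part of the proposed framework:

```python
import math

def allocate_tasks(agents, tasks):
    """Greedily assign each task to the cheapest capable agent.

    agents: dict name -> {"pos": (x, y), "caps": set of task types}
    tasks:  list of {"id": str, "type": str, "pos": (x, y)}
    Returns dict task id -> agent name (None if no capable agent).
    Mutates agent positions to reflect travel to assigned task sites.
    """
    assignment = {}
    for task in tasks:
        best, best_cost = None, float("inf")
        for name, agent in agents.items():
            if task["type"] not in agent["caps"]:
                continue  # agent lacks the equipment for this task
            cost = math.dist(agent["pos"], task["pos"])  # travel-cost proxy
            if cost < best_cost:
                best, best_cost = name, cost
        assignment[task["id"]] = best
        if best is not None:
            agents[best]["pos"] = task["pos"]  # agent moves to task site
    return assignment
```

A real mission planner would replace the distance proxy with the task-specific cost and reward terms described above, and solve the assignment jointly rather than greedily.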

B)Computer Vision in Agriculture: In agriculture, a robot can help perform various tasks such as planting, weeding, harvesting and plant health monitoring. Such robots can detect plants, weeds, fruits and vegetables, analyze their health condition and ripeness to determine the harvesting time, and then reap the crops.

a.Feature engineering: Agricultural environments contain highly repetitive visual features, which makes objects hard to detect and track distinctly. One needs to engineer features that can be computed from the data and tracked reliably.

b.Event detection and tracking: Events such as pest/disease outbreaks or intruder entry need to be tracked in real time, as they can cause substantial damage. For this, a UAV needs to map, detect and track the event and share the information with the ground vehicle for further treatment.

c.Precise localization: For anomaly detection, UAVs/UGVs need precise localization of both the points of interest and themselves. Sensors mounted on present-day UAVs are capable of achieving sub-pixel accuracy but still require heavy computation; this calls for lightweight, fast algorithms.

d.Mapping: The robots can also cooperate to generate 3D maps of the environment annotated with parameters, such as crop density and weed pressure, suitable for supporting the farmer's decision making. The UAV can quickly provide a coarse reconstruction of a large area, which can then be updated with more detailed, higher-resolution map portions generated by the UGV visiting selected areas.

e.Surveillance: To deal with events such as pest infestation, heavy rainfall or intrusion, the farm needs to be surveyed constantly (every 24 hours or at some defined periodicity). A static camera or CCTV may not provide adequate coverage in these cases, whereas a UAV can hover and cover almost the entire field.

f.Visual servoing: Vision-based robot control is required for precise manipulation (for instance, while spraying) and for movement of robots in the field. This can help reduce wastage and improve accuracy.
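One classic engineered feature of the kind described under feature engineering above is the Excess Green (ExG) index, 2g − r − b on chromaticity-normalized channels, widely used to separate vegetation from soil. A minimal sketch follows; the 0.1 threshold is an illustrative assumption, since real deployments tune it per camera and lighting:

```python
def excess_green(r, g, b):
    """Excess Green index ExG = 2g - r - b on chromaticity-normalized channels."""
    total = r + g + b
    if total == 0:
        return 0.0  # black pixel: no chromaticity information
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

def vegetation_mask(pixels, threshold=0.1):
    """Flag (r, g, b) pixels whose ExG exceeds the threshold as vegetation."""
    return [excess_green(*p) > threshold for p in pixels]
```

Per-pixel indices like this are cheap enough to run onboard a UAV, which is exactly the "light and fast" regime the localization and tracking submodules call for.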

C)Machine/Deep Learning for Agriculture: Conventional (linear, statistical) methods fail to meet real-time performance and accuracy requirements when there is a large multiplicity of identifiers (classes/species/diseases, etc.) to handle. Such cases can be dealt with using learning techniques, which offer massive parallelization and are generally used as classifiers and predictors. Their advantages over conventional methods can be listed as follows:

a.Feature extraction: Manually extracting features/patterns from large datasets (e.g., hyperspectral imagery) can be cumbersome and sometimes impossible. This can be handled by using enough feature-extraction layers (e.g., in a CNN) and letting the model learn from the data itself.

b.Multiple/overlapping classes: Conventional methods fail to detect patterns among highly overlapping classes, or among multiple classes at a time. This can be addressed with probabilistic outputs, as produced by any generic learned classifier.

c.Flexible and adaptable: The same model can be applied to different types of crops and diseases; it only needs to be retrained on the new dataset. With conventional methods, new features must be engineered for each new disease/crop.

d.Real-time and accurate: Training takes considerable time and data, but inference is faster than conventional methods, which employ heavy mathematical operations (for example, PCA, SVD) and data pre-processing steps.

e.Challenges addressed: Occlusion, depth variation, illumination, scale, etc.

f.Yield prediction: Yield mapping and estimation can be carried out in real time to match supply and demand.
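The probabilistic treatment of overlapping classes mentioned above can be illustrated with a small sketch: raw classifier scores are converted to probabilities with a softmax, and ambiguous cases are flagged instead of forced into a single label. The labels, scores and margin are hypothetical:

```python
import math

def softmax(scores):
    """Convert raw class scores into a probability distribution."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(scores, labels, min_margin=0.2):
    """Return the top label, or flag ambiguity when classes overlap.

    If the gap between the two most probable classes is below
    min_margin, report both candidates instead of forcing a single
    (likely wrong) decision.
    """
    probs = softmax(scores)
    order = sorted(range(len(probs)), key=probs.__getitem__, reverse=True)
    top, second = order[0], order[1]
    if probs[top] - probs[second] < min_margin:
        return ("ambiguous", labels[top], labels[second])
    return ("confident", labels[top])
```

In practice the ambiguous cases are exactly the ones worth routing to a UGV for a close-up look, tying the classifier back into the multi-agent framework.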

Work Done so Far

Below are the relevant projects I have worked on so far, some as part of my coursework and some for my research. The modules developed for them are listed along with their generic application. Most of the basic and specific modules can be employed in agricultural scenarios with little or no modification.

A.Indoor Localization and Path Planning: This project dealt with the basics of mapping, vision-based localization and path planning in a GPS-denied environment.

a.Modules developed/tested: Mapping, vision-based localization, path planning

b.Agriculture applications: The modules developed can serve as basic building blocks for localizing a robot and planning its path for various location-based tasks. For instance, they can be used to navigate a UGV in a field toward a detected weed patch.
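As a minimal stand-in for such a planning module (the project's own planner is not reproduced here), the following sketch finds a shortest 4-connected route on an occupancy grid via breadth-first search; the grid representation is an illustrative assumption:

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 free, 1 blocked).

    Returns a list of (row, col) cells from start to goal, or None if
    the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    q = deque([start])
    while q:
        cell = q.popleft()
        if cell == goal:          # reconstruct the path by backtracking
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                q.append((nr, nc))
    return None
```

On a field map, blocked cells would come from the vision pipeline (crop rows, obstacles), and a cost-aware variant such as A* would replace plain BFS.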

B.Transient dynamics analysis of a foldable drone: Done as part of a robotics course, this project analyzed the dynamics of a foldable drone in transition from one state to another.

a.Modules developed/tested: UAV dynamics for multiple configurations

b.Agriculture applications: The analysis in this project can be used to evaluate different possible configurations for a task-oriented setup. For example, it can be used to analyze a UAV with a robotic arm in motion.

C.Vehicle detection and classification from aerial imagery: Given aerial images, the task was to detect and classify vehicles on a highway based on their size, number of wheels, etc., using YOLOv3.

a.Modules developed/tested: YOLOv3 network architecture, training and testing setup

b.Agriculture applications: Given an annotated dataset, the network can be deployed to classify objects in aerial images. This can be used to distinguish weeds or diseased plants from healthy ones (event detection).
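Detection pipelines like YOLOv3 rely on two small building blocks, intersection-over-union and non-maximum suppression, to prune overlapping boxes. A minimal sketch of both (the corner-coordinate box format and 0.5 threshold are conventional choices, not taken from the project code):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0.0, ix2 - ix1), max(0.0, iy2 - iy1)
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(detections, iou_thresh=0.5):
    """Greedy non-maximum suppression over (box, score) detections.

    Keeps the highest-scoring box and drops any later box that overlaps
    a kept one by more than iou_thresh.
    """
    kept = []
    for box, score in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(box, k[0]) < iou_thresh for k in kept):
            kept.append((box, score))
    return kept
```

The same post-processing applies unchanged when the classes are weed patches instead of vehicles; only the training data changes.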

D.Real-time stereovision-aided inertial navigation for fast autonomous flight: This work was reviewed to become accustomed to computationally efficient, real-time algorithms designed for fast-paced flight.

a.Module developed/tested: Light and fast algorithm for localization and path planning

b.Agriculture applications: This algorithm lets a UAV travel at higher speeds while keeping track of its pose and orientation, enabling it to cover large farms quickly and thus provide better coverage.

E.Multi-agent collaborative building construction: This framework was developed for the MBZIRC challenge, in which 3 UAVs and 1 UGV construct a wall by picking bricks of desired dimensions and stacking them in the minimum possible time.

a.Module developed/tested: Multi-agent simulation setup, mission planner, task handler, collision avoidance, arm manipulation

b.Agriculture applications: The setup can be deployed for task-oriented problems in which multiple robots collaborate on data gathering and manipulation. For instance, UAVs can detect an event/spot from above and share its location with the UGV, which then travels there to operate. Picking up agricultural tools and spray cans for delivery to another spot is another application. The mission planner would be very useful for coordinating operations among multiple vehicles.

c.The following publication is being further developed for agricultural applications: K. Ankit, L.A. Tony, S. Jana, D. Ghose, "Multi-Agent Collaboration for Building Construction", MBZIRC Symposium, ADNEC, Abu Dhabi, Feb 2020. https://arxiv.org/abs/2009.03584

F.Multi-agent collaborative 3D mapping: This project aimed to fuse maps generated separately by a UAV and a UGV to obtain a more precise and detailed 3D map of the environment.

a.Modules developed/tested: UAV-UGV collaborative mapping pipeline

b.Agriculture applications: This pipeline can generate a geo-referenced, detailed 3D map along with the aerial view for better analysis and operation. In an agricultural setup, a detailed map can be generated by fusing the aerial canopy view with close-up views of the crop rows captured by the UGV, helping to better detect disease, weeds, etc.
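A toy version of this coarse-to-fine fusion idea, reduced to 2D occupancy grids: overlay a high-resolution UGV patch onto the coarse UAV grid, letting known fine cells overwrite the coarse estimate. The encoding (occupancy in [0, 1], −1 for unknown) is an illustrative assumption; the actual pipeline fuses 3D maps:

```python
def fuse_maps(coarse, fine, origin):
    """Overlay a fine UGV patch onto a coarse UAV grid at `origin`.

    Cells hold occupancy values in [0, 1]; -1 marks unknown. Known fine
    cells overwrite the coarse estimate (higher-resolution data wins).
    Returns a new grid; the inputs are left untouched.
    """
    fused = [row[:] for row in coarse]  # deep-enough copy of the 2D grid
    r0, c0 = origin
    for i, row in enumerate(fine):
        for j, v in enumerate(row):
            if v != -1:                 # skip cells the UGV never observed
                fused[r0 + i][c0 + j] = v
    return fused
```

The real system must additionally align the two maps (geo-registration) before fusing; here the `origin` offset stands in for that alignment step.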

G.Deep RL for high-precision tasks: In this project, the high-precision benchmark problem “peg in hole” was solved using a deep RL architecture.

a.Modules developed/tested: LSTMs for precise manipulation of a 6-DOF arm with imprecise encoders for the “peg-in-hole” problem

b.Agriculture applications: This network can be employed to precisely control a robotic arm for tasks such as precision fertigation and irrigation using manipulator arms on the UGV.

H.DMD-MPC: An online learning approach to MPC: In this work, an online learning perspective on model predictive control (MPC) was applied and tested on nonlinear control problems such as the inverted pendulum and half-cheetah.

a.Modules developed/tested: Model predictive controller, DMD-MPC algorithm

b.Applications: This optimal controller can be deployed to predict and control a robot's actions in a disturbance/error-prone environment. A UAV can use this controller in windy conditions to navigate properly across the farm.
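The receding-horizon idea behind MPC (though not the DMD-MPC algorithm itself) can be sketched with random shooting on a 1D double integrator: sample action sequences, roll out the model, and apply only the first action of the cheapest sequence. The dynamics, horizon and cost weights here are illustrative:

```python
import random

def mpc_step(x, goal, horizon=5, samples=200, dt=0.1, seed=0):
    """One receding-horizon step for a 1D double integrator x'' = u.

    Samples random action sequences in [-1, 1], rolls out the model,
    and returns the first action of the lowest-cost sequence
    (random-shooting MPC). x is the current (position, velocity).
    """
    rng = random.Random(seed)  # fixed seed keeps the sketch deterministic
    best_u, best_cost = 0.0, float("inf")
    for _ in range(samples):
        us = [rng.uniform(-1, 1) for _ in range(horizon)]
        px, pv, cost = x[0], x[1], 0.0
        for u in us:               # Euler rollout of the double integrator
            pv += u * dt
            px += pv * dt
            cost += (px - goal) ** 2 + 0.01 * u * u  # tracking + effort
        if cost < best_cost:
            best_cost, best_u = cost, us[0]
    return best_u
```

Re-planning at every step is what gives MPC its robustness to wind and model error: only the first action is executed, and the optimization is repeated from the newly measured state.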

The major requirements for automated agriculture and the modules developed can be linked and visualized using the schematic below. Boxes with bold black outlines are basic modules for primary tasks such as path planning and collision avoidance, while colored boxes are specific modules that depend on the particular task, such as collaborative 3D mapping.