Crops are an important source of food and other products. In conventional farming, tractors apply large amounts of agrochemicals uniformly across fields for weed control and plant protection. Autonomous farming robots that can sense, understand, and act in the field have the potential to provide environmentally friendly weed control and plant protection on a per-plant basis. A perception system that reliably distinguishes crops, weeds, soil, and other objects of relevance under varying environmental conditions is the basis for plant-specific interventions such as spot applications. Developing new approaches and techniques that combine technologies from artificial intelligence (AI), machine learning, robotics, and phenotyping towards sustainable crop production is part of the DFG Cluster of Excellence “PhenoRob.”
To monitor and analyze plants, plots, and whole fields, today’s robots use machine learning techniques combining computer vision and deep neural networks to extract the relevant information from camera images. In research, one often uses so-called semantic segmentation networks, which assign a semantic label such as crop or weed to every pixel in each image. Together with information about the robot’s location, for example from GPS, the environment can be interpreted in a geo-referenced fashion at a cm- to mm-level. Based on such detailed knowledge, robots can take targeted intervention actions, ranging from spot spraying and mechanical treatments to the laser-based destruction of individual weed plants.
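The core idea of semantic segmentation can be illustrated with a minimal sketch: a network produces a score for each class at every pixel, and taking the per-pixel argmax over those scores yields the label map. The class names and toy scores below are illustrative stand-ins, not the actual PhenoRob model output.

```python
import numpy as np

# Illustrative class set; a real segmentation network would be trained on these.
CLASSES = ["soil", "crop", "weed"]

def label_map(scores: np.ndarray) -> np.ndarray:
    """Turn per-pixel class scores of shape (H, W, C) into a (H, W) label map
    by picking the highest-scoring class at every pixel."""
    return scores.argmax(axis=-1)

# Toy 2x2 "image": scores for [soil, crop, weed] at each pixel,
# standing in for the output of a segmentation network.
scores = np.array([
    [[0.9, 0.05, 0.05], [0.1, 0.8, 0.1]],
    [[0.2, 0.1, 0.7],   [0.6, 0.3, 0.1]],
])
labels = label_map(scores)
print([[CLASSES[i] for i in row] for row in labels])
# → [['soil', 'crop'], ['weed', 'soil']]
```

In a deployed system, each pixel of this label map would additionally be mapped to a geo-referenced field position via the robot’s GPS pose, so that an intervention can be triggered exactly where a weed was detected.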
Analyzing and Understanding the Environment
Today’s semantic segmentation systems often show a performance decay when applied under new conditions. This is a general problem in supervised machine learning applied to real-world tasks, where the environment changes naturally. Researchers in machine learning, AI, robot learning, and several other fields are actively working to bridge this gap between the source domain, for example the training environment, and the target domain. This transfer of supervised learning systems is often called domain transfer or domain adaptation.
Domain Adaptation within the PhenoRob Robots
Within PhenoRob, we are developing approaches to unsupervised domain adaptation for plant segmentation systems in agriculture. Being able to perform such an adaptation in an unsupervised way allows the learned neural networks to generalize automatically based solely on unlabeled data. Our new techniques aim at generalizing and adapting existing segmentation networks to new field environments, different crops, and other farm robots. This means that we can transfer a classifier for sugar beets to other crops, for example sunflowers, or from a field robot to a small unmanned aerial vehicle.
[Figure] The image shows the output of our network: an image from Bonn translated to look like an image taken in Stuttgart.
Researchers from PhenoRob have shown that their systems yield a high segmentation performance in the target domain by exploiting labels only from the source domain – across locations, crops, and robots. The techniques make use of CycleGANs. GANs are generative adversarial networks that consist of two neural networks that compete against each other. Through this competition, both networks improve over time. What makes the proposed CycleGAN especially effective is enforcing semantic consistency of the domain transfer: the images are constrained to be consistently classified pixel by pixel when performing transfers in both directions, for example from the source to the target domain and back to the source domain. This technique substantially improves the transfer of semantic segmentation systems trained on agricultural fields to new field environments, different crops, and different sensors or robots. This, in turn, impacts the usability of farming robots and has the potential to bring such systems closer to practice, such that the same robots can be used across environments and crops in a robust manner.
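The semantic-consistency idea above can be sketched as a loss term: a fixed segmentation network should assign the same per-pixel labels to a source image and to its round-trip translation (source → target → source). The generator and segmenter below are hypothetical placeholders standing in for the trained CycleGAN generators and segmentation CNN; this is a minimal sketch of the constraint, not the authors’ implementation.

```python
import numpy as np

def semantic_consistency_loss(x, g_st, g_ts, segment):
    """Penalize pixel-wise disagreement between the segmentation of a source
    image x and its round-trip translation g_ts(g_st(x)).
    g_st, g_ts: source->target and target->source generators (stand-ins).
    segment: fixed segmentation net returning (H, W, C) class probabilities."""
    p_orig = segment(x)                # prediction on the original source image
    p_cycle = segment(g_ts(g_st(x)))   # prediction after the round trip
    labels = p_orig.argmax(axis=-1)    # pseudo-labels from the source prediction
    rows, cols = np.indices(labels.shape)
    # Cross-entropy of the round-trip prediction against the pseudo-labels.
    return -np.mean(np.log(p_cycle[rows, cols, labels] + 1e-8))

# Toy check: with identity generators the round trip changes nothing, so the
# loss reduces to the segmenter's own confidence on x.
identity = lambda img: img

def dummy_segment(img):
    # Constant 80% confidence in class 0 at every pixel (stand-in for a CNN).
    probs = np.full(img.shape[:2] + (3,), 0.1)
    probs[..., 0] = 0.8
    return probs

x = np.zeros((4, 4, 3))
loss = semantic_consistency_loss(x, identity, identity, dummy_segment)
print(round(loss, 3))  # ≈ 0.223, i.e. -log(0.8)
```

During training, this term would be added to the usual adversarial and cycle-consistency losses of the CycleGAN, so that the image translation cannot change what a pixel depicts, only how it looks.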
The Cluster of Excellence already introduced itself in an earlier post, take a look: Drohnen und Roboter für eine nachhaltige Landwirtschaft der Zukunft