Intelligent Perception for Crop Row Automation, 10-R6942

Principal Investigator
Inclusive Dates 
03/17/20 to 07/17/20

BACKGROUND

Weeds are a common problem in crop row farming because they consume the nutrients and water that the crop needs to thrive. This project researched and demonstrated the ability to detect and localize weeds and crops from two camera views to enable automated weed management.

APPROACH

This project applied a three-stage approach to determine whether deep learning could detect crops and weeds from two camera perspectives of a lettuce crop row. First, an extensive training dataset was annotated to define ground-truth labels for the vegetation in a representative environment. Second, a Convolutional Neural Network-based algorithm was trained on the dataset to learn the features that distinguish crops and weeds from the surrounding environment. Third, the trained network model was evaluated on a new dataset to benchmark and demonstrate its performance.
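
As a concrete illustration of the three stages, the sketch below shows how a detection network of this kind might be built and trained. The report does not name the architecture, framework, or label set, so Faster R-CNN, PyTorch/torchvision, and the crop/weed/background classes are assumptions for illustration only.

    # Hypothetical sketch only: architecture, framework, and labels are assumed,
    # not taken from the report.
    import torch
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    NUM_CLASSES = 3  # background, crop, weed (assumed label set)

    def build_model():
        # Start from a detector pretrained on COCO and replace its box head so it
        # predicts the vegetation classes defined by the annotated dataset (stage 1).
        model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
        in_features = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
        return model

    def train_one_epoch(model, loader, optimizer, device):
        # Stage 2: learn features that distinguish crops and weeds from the
        # surrounding environment using the annotated ground-truth boxes.
        model.train()
        for images, targets in loader:
            images = [img.to(device) for img in images]
            targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
            losses = model(images, targets)   # dict of detection/classification losses
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()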

ACCOMPLISHMENTS

This project provided valuable insight into the complexities of developing a robust classification system that accounts for the variations in size, shape, and occlusion of crops and weeds that occur in the environment. The system achieved 95% positive classification accuracy while processing frames at a rate of 15 Hz. This performance scales to identifying and localizing vegetation from two camera views at a speed suitable for weed management systems.
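
For context on the reported figures, the sketch below shows one way the accuracy and per-frame classification rate of such a model could be benchmarked on a held-out dataset. The matching logic is deliberately simplified and the function names are hypothetical; the project's actual evaluation protocol is not described in this report.

    # Hypothetical stage-3 sketch: measure classification accuracy and throughput (Hz).
    import time
    import torch

    @torch.no_grad()
    def benchmark(model, loader, device, score_thresh=0.5):
        model.eval()
        correct, total, frames, elapsed = 0, 0, 0, 0.0
        for images, targets in loader:
            images = [img.to(device) for img in images]
            start = time.perf_counter()
            outputs = model(images)            # one detection result per frame
            elapsed += time.perf_counter() - start
            frames += len(images)
            for out, tgt in zip(outputs, targets):
                keep = out["scores"] >= score_thresh
                # Simplified check: does the top-scoring detection match the
                # annotated class? A full benchmark would match boxes by IoU.
                if tgt["labels"].numel() > 0:
                    total += 1
                    if keep.any():
                        correct += int(out["labels"][keep][0] == tgt["labels"][0])
        accuracy = correct / max(total, 1)
        rate_hz = frames / max(elapsed, 1e-9)
        return accuracy, rate_hz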

The results of this project demonstrate that deep learning algorithms can meet the requirements for crop row vegetation classification in a dual-camera system. This is an improvement over currently fielded solutions, which use conventional color- and edge-based features on a single camera view.
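
For comparison, the fragment below sketches the kind of conventional color-based feature that single-camera solutions of this type rely on, here an Excess Green (ExG) vegetation index with a global threshold. The specific features used in fielded systems are not given in this report, so ExG and Otsu thresholding are illustrative stand-ins; note that this approach segments vegetation from soil but cannot by itself distinguish crop from weed.

    # Illustrative conventional baseline: Excess Green index + Otsu threshold.
    import cv2
    import numpy as np

    def vegetation_mask(bgr_image: np.ndarray) -> np.ndarray:
        b, g, r = cv2.split(bgr_image.astype(np.float32) / 255.0)
        exg = 2.0 * g - r - b                         # Excess Green vegetation index
        exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return mask  # 255 = vegetation, 0 = background (soil)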