Research

Machine vision system developed to locate king flowers on apple trees

Xinyang Mu, doctoral candidate in the Department of Agricultural and Biological Engineering, who led the research, works on data collection in the field at Penn State’s Fruit Research and Extension Center, Biglerville. Credit: Penn State. Creative Commons

UNIVERSITY PARK, Pa. — In a first-of-its-kind study, Penn State researchers devised a machine vision system capable of locating and identifying apple king flowers within clusters of blossoms on orchard trees, a critical early step in the development of a robotic pollination system.

Apple blossoms grow in groups of four to six blooms attached to branches, and the center blossom is known as the king flower. This flower opens first in the cluster and usually grows the largest fruit. So, it is the key target of a robotic pollination system, according to researcher Long He, assistant professor of agricultural and biological engineering.

Apple productivity has traditionally relied on insect pollination. However, evidence suggests that pollination services, from both domesticated honeybees and wild pollinators, are not keeping pace with increasing demand, He noted. Due to colony collapse disorder, honeybees around the world have been dying at alarming rates. As a result, producers need alternative methods of pollination.

This study is the latest conducted by He’s research group in the College of Agricultural Sciences, which is devoted to developing robotic systems to accomplish labor-intensive agricultural tasks such as mushroom picking, apple tree pruning and green-fruit thinning. The primary goal of this project, He explained, was to develop a deep learning-based vision system that could precisely identify and locate king flowers in tree canopies.

“We think this result will provide baseline information for a robotic pollination system, which would lead to efficient and reproducible pollination of apples to maximize the yield of high-quality fruits,” He said. “In Pennsylvania, we still can rely on bees to pollinate apple crops, but in other regions where bee die-offs have been more severe, growers may need this technology sooner than later.”

Xinyang Mu, a doctoral candidate in the Department of Agricultural and Biological Engineering, spearheaded the king flower study. Mu used Mask R-CNN, a widely used deep-learning model that performs pixel-level segmentation and can detect objects partially obscured by other objects, to identify and locate the king flowers in a machine vision system.
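
The article does not include the team’s code, but the kind of pixel-level segmentation Mask R-CNN performs can be illustrated with an off-the-shelf implementation. The Python sketch below runs torchvision’s pretrained Mask R-CNN on a single image and keeps the per-object masks; the pretrained weights, file name and score threshold are illustrative assumptions rather than details from the study.

```python
# Illustrative only: off-the-shelf Mask R-CNN inference with torchvision.
# The pretrained COCO weights know nothing about apple blossoms; the study
# fine-tuned its own model, for which no code is published in this article.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# "blossom_cluster.jpg" is a hypothetical file name used for illustration.
image = convert_image_dtype(read_image("blossom_cluster.jpg"), torch.float)

with torch.no_grad():
    output = model([image])[0]  # one dict of predictions per input image

# Keep confident detections. Each entry in "masks" is a per-pixel probability
# map, which is what lets partially occluded objects still be delineated.
keep = output["scores"] > 0.5
masks = output["masks"][keep] > 0.5   # boolean pixel masks, shape (N, 1, H, W)
boxes = output["boxes"][keep]
print(f"{masks.shape[0]} objects detected")
```

Because each detection carries a full pixel mask rather than just a bounding box, flowers partly hidden behind their neighbors can still be outlined, which is why the approach suits clusters where the king flower sits behind other blooms.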

To build the Mask R-CNN-based detection model, he captured hundreds of photos of apple blossom clusters, then developed a king flower segmentation algorithm to identify and locate the king flowers in that raw image dataset. The research was conducted at Penn State’s Fruit Research and Extension Center in Biglerville.
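
A common way to adapt a pretrained Mask R-CNN to a small, specialized image set such as these blossom photos is transfer learning: keep the pretrained backbone and replace only the prediction heads. The sketch below shows that standard torchvision recipe; the class count and all settings are assumptions, since the article gives no implementation details.

```python
# Illustrative transfer-learning setup only; the study's actual training code
# and hyperparameters are not published in this article.
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 3  # background + the two flower classes described below (an assumption)

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

# Swap the COCO-trained box and mask heads for heads sized to our classes,
# keeping the pretrained backbone -- the usual transfer-learning recipe.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

in_channels_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels_mask, 256, NUM_CLASSES)

# Training then follows the standard torchvision detection loop:
# for images, targets in data_loader:
#     losses = model(images, targets)          # dict of box/class/mask losses
#     sum(losses.values()).backward()          # plus an optimizer step
```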

Gala and Honeycrisp apple varieties were selected for the tests. The test trees were planted in 2014 with tree spacing of about 5 feet (Gala) and 6 1/2 feet (Honeycrisp), and were trained in a tall spindle canopy architecture with an average height of about 13 feet. The camera-based image-acquisition system was mounted on a utility vehicle that maneuvered between the tree rows.

Training the machine vision system to locate king flowers was challenging, Mu pointed out, because they are the same size, color and shape as the lateral blossoms in their clusters and, owing to their central position, are typically obscured by the surrounding flowers.

To fulfill the requirements of transfer learning for Mask R-CNN model training, the raw images were labeled in two predefined classes: individual flowers and occluded flowers. To enhance precision, the training dataset was enlarged to four times its original size using data-augmentation approaches, Mu explained.
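
The article does not say which augmentation approaches were used. As a minimal illustration of how a segmentation dataset can be enlarged fourfold, the sketch below generates the original plus three flipped variants of each image, flipping the instance masks in lockstep so the labels stay aligned; any bounding boxes would need the matching update as well.

```python
# A minimal sketch of fourfold augmentation via flips; the study's actual
# augmentation methods are not specified in the article.
import numpy as np

def augment_4x(image: np.ndarray, masks: np.ndarray):
    """Return 4 variants (original, h-flip, v-flip, both) of an (H, W, C)
    image and its (N, H, W) instance masks, flipped together."""
    variants = []
    for flip_h in (False, True):
        for flip_v in (False, True):
            img, msk = image, masks
            if flip_h:                       # flip along the width axis
                img, msk = img[:, ::-1], msk[:, :, ::-1]
            if flip_v:                       # flip along the height axis
                img, msk = img[::-1, :], msk[:, ::-1, :]
            variants.append((img.copy(), msk.copy()))
    return variants
```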

“To distinguish king flowers from lateral flowers, the most central flower within each flower cluster was targeted, or localized,” he said. “The vision system automatically located the flower clusters separately based on a two-dimensional flower density mapping approach. Within each detected flower cluster, the flower — or the mask — at the most centered position was determined as the target king flower.”
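
The two-dimensional flower density mapping itself is not detailed in the article, but the cluster-then-pick-the-center idea in the quote can be sketched with a simple stand-in: group the detected flower centroids into clusters and take the flower closest to each cluster’s geometric center as the king flower. DBSCAN is used below purely as a placeholder for the authors’ density mapping, and the distance threshold is an assumption.

```python
# Illustrative stand-in only; DBSCAN clustering of mask centroids is not the
# authors' density-mapping method, just a simple way to show the idea.
import numpy as np
from sklearn.cluster import DBSCAN

def pick_king_flowers(centroids: np.ndarray, eps_px: float = 80.0):
    """centroids: (N, 2) array of detected-flower mask centers in pixels.
    Returns the index of the most centrally placed flower in each cluster."""
    labels = DBSCAN(eps=eps_px, min_samples=2).fit_predict(centroids)
    king_idx = []
    for cluster_id in set(labels) - {-1}:          # -1 marks lone/noise flowers
        members = np.where(labels == cluster_id)[0]
        center = centroids[members].mean(axis=0)   # geometric center of the cluster
        dists = np.linalg.norm(centroids[members] - center, axis=1)
        king_idx.append(int(members[np.argmin(dists)]))  # most central flower
    return king_idx
```

In practice the centroids would come from the predicted masks themselves, for example the mean pixel coordinate of each mask produced by the detection model.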

In findings recently published in Smart Agricultural Technology, the researchers reported a high level of king flower-detection accuracy for Mu’s algorithm. Compared with ground truth measurements, in which researchers identified the king flowers by eye, the machine vision system’s detection accuracy ranged from 65.6% to 98.7%.

Contributing to the research at Penn State were Paul Heinemann, professor of agricultural and biological engineering, and James Schupp, professor of pomology, along with Manoj Karkee, Center for Precision and Automated Agriculture, Washington State University.

The research was supported by the U.S. Department of Agriculture’s National Institute of Food and Agriculture and USDA’s Specialty Crop Multi-State Grant Program.

Last Updated January 27, 2023
