How Does My Garden Grow? With Drones, Sensors, and AI All in a Row

Crop scientists hope to replace traditional, painstaking monitoring methods with automation

Photo: Photon Systems Instruments
Crop Vision: Photon Systems Instruments, in the Czech Republic, sells automated phenotyping systems for use in greenhouses and in the field.

A decade ago, a group of crop scientists set out to grow the same plants in the same way. They started with the same breeds and adhered to strict growing protocols, but nonetheless harvested a motley crop of plants that varied in leaf size, skin-cell density, and metabolic ability. Small differences in light levels and plant handling had produced outsize changes to the plants’ physical traits, or phenome.

The plunging price of genomic sequencing has made it easier to examine a plant’s biological instructions, but researchers’ understanding of how a plant follows those instructions in a given environment lags. “There is a major bottleneck for a lot of breeders to be able to get their phenotypic evaluation in line with their genetic capabilities,” says Bas van Eerdt, business development director at PhenoKey, in ’s-Gravenzande, Netherlands.

Breeders would like to know, simply by observing how a plant grows, whether it (or better, a whole crop) is developing on track and how it is responding to local weather. Now, with cheaper sensors and more powerful artificial-intelligence algorithms, researchers are inching closer to that goal. Their hope is to make the typical 1.3 percent annual improvement in crop yields look more like Moore's Law.
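To put that ambition in perspective, here is a back-of-the-envelope comparison (not from the article) of two decades of compounding at the 1.3 percent annual rate quoted above versus a Moore's Law-style doubling every two years:

```python
# Illustrative arithmetic only: the 1.3 percent annual figure comes from
# the article; the two-year doubling period is the classic Moore's Law
# cadence, used here purely for contrast.

def compound(rate_per_year: float, years: int) -> float:
    """Return the growth multiple after `years` at a fixed annual rate."""
    return (1.0 + rate_per_year) ** years

crop_gain = compound(0.013, 20)   # roughly 1.29x after two decades
moore_gain = 2 ** (20 / 2)        # 1024x if output doubled every 2 years

print(f"Crop yield multiple after 20 years: {crop_gain:.2f}x")
print(f"Moore's Law multiple after 20 years: {moore_gain:.0f}x")
```

The gap between a ~1.29x gain and a 1024x gain is why the comparison is aspirational rather than literal.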

The go-to technique for this work is still optical imaging. Some researchers are now writing software to allow growers to use smartphone cameras to quantify some parts of a crop’s phenotype. But they are also adapting an array of more sophisticated imaging technologies from aerospace and biomedical physics to the field. Breeders in North Carolina and the Netherlands are using drones and greenhouses equipped with hyperspectral, fluorescent, and tomographic sensors to quantify more of their crop’s phenomes.
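A minimal sketch of the kind of smartphone-image analysis described above: scoring canopy cover from an ordinary RGB photo with the excess-green index (ExG = 2G − R − B), a common baseline in plant phenotyping. The 0.1 threshold and the toy two-pixel "image" are illustrative choices, not details from the article or any particular research group's software.

```python
import numpy as np

def canopy_fraction(rgb: np.ndarray, threshold: float = 0.1) -> float:
    """Fraction of pixels classified as vegetation via excess green.

    `rgb` is an (H, W, 3) array of floats in [0, 1].
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b          # excess-green index per pixel
    return float((exg > threshold).mean())

# Toy example: one leafy-green pixel, one soil-colored pixel.
img = np.array([[[0.2, 0.7, 0.1],    # green leaf  -> ExG = 1.1
                 [0.5, 0.4, 0.3]]])  # brown soil  -> ExG = 0.0
print(canopy_fraction(img))          # -> 0.5
```

Real tools add color calibration and segmentation, but the core idea is this simple: a per-pixel index plus a threshold turns a phone photo into a quantitative trait.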

Hyperspectral imaging can reveal hidden damage from insects. Magnetic resonance imaging (MRI) can detect droplets of water as a seed absorbs them, and can follow the seed through germination and later stages of development. Positron emission tomography (PET) lets researchers peer through soil into flower bulbs and visualize the layout of a plant’s root system.

The European Union spent some €250 million (about US $300 million) between 2005 and 2015 on plant phenotyping research infrastructure, and American crop giants and government agencies are spending millions on this research alongside major breeding companies such as Syngenta and Bayer.

Early Symptoms: This plant, which was sprayed with an herbicide, shows no visible damage to its leaves. But a scan that measures chlorophyll fluorescence suggests the plant’s ability to photosynthesize is already being affected. Images: PhenoKey/PhenoVation

In the past, evaluating a new crop variety required breeders to visit every plant in a test plot, take detailed notes, and rank all the plants for the next round of breeding. “This is actually the limiting factor in the experiments that we run,” says roboticist and business developer Rick van de Zedde of Wageningen University & Research, in the Netherlands. “It’s not really about the cost; it’s more about the enormous amount of time that it takes.”

Instead, PhenoKey annotates thousands of images of test crops, adding labels to identify characteristics such as flower bud count and leaf shape. The company uses these annotations to train its artificial-intelligence software on the traits of a specific type of plant. In one case van Eerdt presented a few years ago, a breeding company spent no more than 50 person-hours improving an image-analysis algorithm so that it could detect orchid buds with 95 percent accuracy in a greenhouse full of plants—about a twentieth of the time it took to describe the plants manually.
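The workflow described above can be sketched in miniature: annotated examples (feature vectors labeled "bud" or "background") train a model that then scores new detections. A nearest-centroid classifier stands in here for PhenoKey's actual, unpublished software; the features and labels are invented for illustration.

```python
import numpy as np

def train_centroids(features: np.ndarray, labels: np.ndarray) -> dict:
    """Average the annotated feature vectors for each label."""
    return {lab: features[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def predict(centroids: dict, x: np.ndarray) -> str:
    """Assign x to the label of the nearest centroid."""
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

# Toy annotated set: 2-D features (say, blob size and roundness).
X = np.array([[0.9, 0.8], [0.8, 0.9],   # annotated as buds
              [0.1, 0.2], [0.2, 0.1]])  # annotated as background
y = np.array(["bud", "bud", "bg", "bg"])

model = train_centroids(X, y)
print(predict(model, np.array([0.85, 0.85])))  # -> bud
```

Production systems use deep networks rather than centroids, but the economics are the same: a modest annotation effort up front replaces per-plant manual scoring afterward.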

Van de Zedde won €22 million in 2018 to build a new Dutch national phenotyping research facility that joins a small but growing number of facilities around the world.

The ultimate goal, van Eerdt says, is to combine automated phenotyping with automated genomics screening. “If you have a deep understanding of how your genetics work…and a model that predicts phenotypic outcomes, then in theory, it’s possible to predict how your crop will look,” he says.

This article appears in the January 2019 print issue as “Automated Eyes Watch Plants Grow.”
