A robot’s ‘eyes’ and a satellite’s view are changing the way we detect crop diseases
In a special test vineyard in New York State, you may spot an unusual sight: tall, boxy robots with rugged tires patrolling the cultivated rows, moving autonomously thanks to onboard GPS.
As they navigate the landscape, customized sensing cameras capture not just images but also spectral data: patterns of reflected light that allow identification and analysis not possible with the naked eye. The hope is that, as the PhytoPatholoBots gather information about the vines, analyzing patterns in growth and development as well as indicators of blight, they will be able to detect not just where plants are already diseased, but also where they're at risk of developing problems.
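The article's sources don't spell out which spectral metrics the robots compute, but a standard index like NDVI illustrates the principle: comparing reflectance across light bands reveals plant stress that an ordinary photo can't. The sketch below is a minimal illustration with invented reflectance values, not the PhytoPatholoBots' actual processing.

```python
import numpy as np

# Hypothetical per-pixel reflectance for two spectral bands, scaled 0-1.
# Healthy vegetation reflects strongly in near-infrared (NIR) and absorbs red.
red = np.array([0.08, 0.10, 0.30, 0.09])
nir = np.array([0.55, 0.50, 0.35, 0.52])

# NDVI (Normalized Difference Vegetation Index): a classic spectral metric.
# Values near 1 suggest a vigorous canopy; low values can flag stress or disease.
ndvi = (nir - red) / (nir + red)

for i, v in enumerate(ndvi):
    status = "possible stress" if v < 0.4 else "healthy range"
    print(f"pixel {i}: NDVI = {v:.2f} ({status})")
```

A conventional RGB camera sees only three broad bands; an imaging spectrometer of the kind described here samples many narrow ones, which is what makes indices like this (and far subtler ones) possible.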
But for Katie Gold and Yu Jiang of Cornell University, a smart robot that can monitor crop health better and more safely than any human is only a stepping stone to something cosmically larger.
The two researchers are conducting assessments and collecting data ahead of the anticipated launch of the NASA Surface Biology and Geology satellite, currently expected to take place in 2028. According to NASA, the satellite’s payload, which will include a visible-to-shortwave infrared imaging spectrometer and thermal imager, “will provide mow-the-lawn (level of detail) global coverage of Earth’s terrestrial, coastal and open ocean surfaces as well as atmospheric conditions.”
Gold and Jiang hope that the close, detailed agricultural images it collects will be able to give farmers advance notice of threats to their crops, allowing them to address the problems. And they’re using their research robots for “ground truth” — a method of calibrating what a satellite sees by verifying its data collection and assessments at ground level.
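Neither researcher details the calibration math in this piece. As a rough sketch of what ground truthing can look like in practice, here is a minimal linear correction fitted between invented satellite readings and matching field measurements of the same quantity:

```python
import numpy as np

# Invented example data: the same vegetation index measured two ways.
satellite = np.array([0.42, 0.55, 0.61, 0.70, 0.78])   # coarse pixels from orbit
ground    = np.array([0.40, 0.57, 0.58, 0.73, 0.80])   # robot readings in the rows

# Fit a simple linear correction mapping satellite readings to ground readings.
slope, intercept = np.polyfit(satellite, ground, 1)

# Residuals after correction indicate how trustworthy the satellite product is.
corrected = slope * satellite + intercept
rmse = np.sqrt(np.mean((corrected - ground) ** 2))

print(f"calibration: ground ≈ {slope:.2f} * satellite + {intercept:.2f}")
print(f"RMSE after correction: {rmse:.3f}")
```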
Gold, an assistant professor and the Susan Eckert Lynch Faculty Fellow in the Plant Pathology and Plant-Microbe Biology Section of the School of Integrative Plant Science (SIPS) in Cornell's College of Agriculture and Life Sciences, is an expert on remote plant disease sensing with a specific focus on grape pathology. As a graduate student, she spent nine months embedded at the NASA Jet Propulsion Laboratory, an experience that widened her perspective on how she could work to protect agriculture.
Jiang, an assistant professor at SIPS, is an applied roboticist with a special focus on image processing and artificial intelligence (AI).
During COVID-19 lockdowns in 2020, Gold and Jiang started meeting remotely to brainstorm over a shared concern: the threat plant disease poses to the world's food supply. As Jiang notes, the world's population is expected to reach 10 billion by 2050. Feeding that many people, experts say, will require production to grow 2.4 percent every year until then; compounded over roughly three decades, that amounts to nearly doubling today's output.
Plant disease accounts for 20 percent to 40 percent of crop loss annually, Jiang says. “If we can solve that, I think we see the way” to meeting the 2050 target.
For Gold, optimizing the yield of existing crops that the world already consumes and enjoys is a more appealing solution than other proposals, such as turning to eating insects to meet caloric requirements. As she says, “it’s about the quality, not just about the caloric density” of the food.
The vineyards of upstate New York have proven to be an excellent testing ground. “Grapes are disproportionately supportive of small, local ecosystems, especially here in New York,” Gold says. The ecosystem of the local specialty crop “is actually really well set up for us to operationally conduct the research.”
The crop-patrolling robots Gold and Jiang developed offer direct advantages of their own. In California, where they also conduct research, conditions such as wildfires and extreme temperatures underscore the benefit of having a machine, rather than a human, conduct plant and leaf analysis.
“If there’s a robot that can be out there in the extreme temperatures, then the human labor can be under a better working environment or condition,” Gold says.
After the robot collects its detailed visual and spectral imagery, Jiang says, a customized AI model compares it to a data bank to spot patterns known to be indicators of existing disease or vulnerability, allowing a farmer to concentrate prevention or treatment efforts on a particular location.
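The team's actual model isn't described beyond this, but a classic hyperspectral technique, the spectral angle mapper, captures the basic pattern-matching idea: label each pixel by the labeled reference spectrum it most resembles. Everything below (the band count, the reference bank, the spectra) is invented for illustration.

```python
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle between two reflectance spectra; smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Invented 4-band reference spectra standing in for a labeled data bank.
reference_bank = {
    "healthy":         np.array([0.05, 0.10, 0.45, 0.50]),
    "early_infection": np.array([0.07, 0.14, 0.35, 0.40]),
    "advanced_blight": np.array([0.12, 0.20, 0.22, 0.25]),
}

def classify(pixel: np.ndarray) -> str:
    """Label a pixel by its nearest reference spectrum."""
    return min(reference_bank,
               key=lambda label: spectral_angle(pixel, reference_bank[label]))

# A field pixel that reflects less near-infrared light than a healthy vine would.
observed = np.array([0.08, 0.15, 0.33, 0.38])
print(classify(observed))  # -> "early_infection" for this made-up spectrum
```

A production model would be far richer, likely a trained neural network rather than nearest-spectrum matching, but the output is the same in spirit: a per-location label a farmer can act on.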
The research team is aware that not every grower will be able to afford high-tech patrol robots, which currently cost about $50,000 apiece, Jiang says; that's especially true when the area to be assessed is much larger than the few hundred acres of test vineyard their robots monitor.
To help all growers benefit, Gold and Jiang are looking to the forthcoming satellite, which will help them develop a regularly updated risk assessment map combining hyperspectral images from space with ground data to show existing crop problems and likely areas of concern.
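Details of the planned map aren't public, but the fusion step can be sketched simply: trust confirmed ground observations where they exist, and fall back on the satellite-derived score everywhere else. All grid values below are invented.

```python
import numpy as np

# Invented 4x4 grid of field blocks; scores run from 0 (clear) to 1 (urgent).
satellite_risk = np.array([
    [0.1, 0.2, 0.6, 0.3],
    [0.1, 0.5, 0.7, 0.2],
    [0.0, 0.1, 0.4, 0.2],
    [0.0, 0.0, 0.1, 0.1],
])

# Sparse ground observations: 0 where nothing was checked, otherwise a
# confirmed severity from a robot or scout. Confirmed readings override orbit.
ground_obs = np.zeros_like(satellite_risk)
ground_obs[1, 2] = 0.9   # blight confirmed in this block

risk_map = np.where(ground_obs > 0, ground_obs, satellite_risk)

# Flag blocks that warrant targeted treatment or a follow-up visit.
rows, cols = np.where(risk_map >= 0.5)
for r, c in zip(rows, cols):
    print(f"block ({r},{c}): risk {risk_map[r, c]:.1f}")
```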
Later this year, they are hoping to get the next best thing to satellite imagery: surveillance flights by manned aircraft equipped with hyperspectral sensors, courtesy of the Jet Propulsion Laboratory, that will generate "landscape-scale imagery" to help build early versions of their planned risk maps, Gold says.
An accurate vulnerability and disease identification model “can really offer affordable and suitable solutions to the wide range of growers in the long run,” Jiang says. “We want to put equity and inclusion into development consideration, so that our technology is really benefiting the whole green production system, regardless of size, regardless of current financial situation.”
Gold and Jiang hope their work will help address real concerns about the future of farming by incorporating technology in an affordable and broadly accessible way. As Jiang notes, the farmers who feed America make up less than 2 percent of the U.S. population — and that population is at risk as current growers age and retire.
“I would envision AI and robotics being our new partners,” Jiang says. “For people who are enthusiastic about nature, about farming, about how to really benefit the whole society … (that) will be their new partnership.”