Glacier FarmMedia – Artificial intelligence may soon create a sea change in agriculture that could challenge conventional crop production and farm management techniques.
Many components required to build autonomous, smart agricultural equipment for vegetable and grain production in North America are already proven technologies.
Sensors such as cameras, radar and lidar (light detection and ranging), along with components for machine electrification such as electrically driven hydraulic pumps, batteries and control systems, are also far more available now than they were a few years ago.
The rapid increase in component options available to original equipment manufacturers (OEMs), as well as to short-line and start-up equipment makers, suggests competitive pricing pressure will be an early feature of the autonomous farm equipment market.
Most agricultural systems now available can perform tasks semi-autonomously, from autosteer to spot-spraying systems such as Weed-It and John Deere’s See and Spray Select. They are based on straightforward, if complex, computer programming.
For instance, spot sprayers use cameras to identify plants in a field and then send commands to individual nozzles so that only the areas where plants were found are sprayed.
These systems can’t differentiate between a weed and a crop plant, but that will change quickly because the AI and machine learning strategies that drove the digital revolution in other sectors are being adapted for agriculture.
Bilberry is a tech start-up founded in 2015 by Guillaume Jourdain, Hugo Serrat and Jules Beguerie, who developed an AI-based system to drive green-on-green spot spraying with the help of technology developed for autonomous automobiles.
Green-on-green spot spraying means finding and spraying individual weeds within a growing crop.
“At that time (2015) it was really the beginning of artificial intelligence embedded in vehicles, in a very broad way,” said Jourdain.
“Before, it was very difficult to solve all the technological challenges that exist for spot spraying. But with AI and the rise of embedded systems, really we were at the right time to work on this and try to finish the solution.”
The Bilberry system has a camera every three metres on the boom. Each camera has a dedicated processor that sends information to a central computer in the sprayer’s cab.
“From there we send the information to the nozzles to open them and close them in real-time, so individual nozzle control. Obviously, we are also linked to the GPS so we also have a section control that’s working,” Jourdain said.
“Normal speed for us (is) to be about 20 km/h. It can be a bit faster, but 20 km/h is where we are very comfortable.”
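In outline, that control loop is simple even though the perception behind it is not. Below is a minimal sketch of mapping detected weed positions to individual nozzle commands; the nozzle spacing, boom width and function names are illustrative assumptions, not Bilberry’s implementation.

```python
# Minimal sketch of camera-to-nozzle control; spacing, boom width and the
# detection format are assumptions for illustration, not Bilberry's code.
NOZZLE_SPACING_M = 0.5   # assumed distance between nozzles on the boom
BOOM_WIDTH_M = 36.0      # assumed boom width
NUM_NOZZLES = int(BOOM_WIDTH_M / NOZZLE_SPACING_M)

def nozzle_commands(weed_positions_m, margin_m=0.25):
    """Map detected weed positions (metres from the boom's left end)
    to open/close commands for each nozzle."""
    commands = [False] * NUM_NOZZLES
    for x in weed_positions_m:
        lo = max(0, int((x - margin_m) / NOZZLE_SPACING_M))
        hi = min(NUM_NOZZLES - 1, int((x + margin_m) / NOZZLE_SPACING_M))
        for i in range(lo, hi + 1):
            commands[i] = True   # open only the nozzles over the weed
    return commands

# Example: two weeds detected in the current camera frame
print(nozzle_commands([4.2, 17.8]))
```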
Training machine learning algorithms is a long and tedious process. Bilberry started by driving fields with sprayers and four-wheel drive vehicles equipped with cameras.
The images were then labelled by manually identifying the plants in each photo, after which the AI training process could begin.
“AI training means basically showing the labelled images to the algorithm several hundred or thousands of times so that it can start learning what the weeds are, what the crop is, and then in a new situation it will be able to say, ‘OK, that’s a weed or that’s a crop’,” Jourdain said.
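In software terms, that process is a loop that repeatedly shows labelled examples to a network and nudges its weights after each pass. The PyTorch sketch below illustrates the idea; the toy classifier and random stand-in images are assumptions for illustration, not Bilberry’s pipeline.

```python
# Generic "show labelled images many times" training loop; the model and
# data are stand-ins, not Bilberry's network or field imagery.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 2))  # toy weed/crop classifier
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for thousands of manually labelled images (0 = crop, 1 = weed)
images = torch.randn(32, 3, 64, 64)
labels = torch.randint(0, 2, (32,))

for epoch in range(100):            # each pass shows every labelled image again
    logits = model(images)
    loss = loss_fn(logits, labels)  # penalize wrong weed/crop predictions
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                # nudge weights to reduce the error
```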
He said Bilberry’s spot spray system is effective at taking broadleaf weeds out of cereal crops, with a 90 per cent average hit rate of the weeds while using a fraction of the chemical required for blanket applications.
The company continues to train its algorithms to improve their ability to identify different weeds in many crops, but the system is already commercially available in parts of Australia.
Bilberry is working with spraying manufacturers including Europe’s Agrifac and SwarmFarm Robotics, an Australian start-up that sells small autonomous robots used for multiple applications.
Bosch and BASF’s new joint venture, called Bosch BASF Smart Farming, will offer its AI-based green-on-green smart spraying technology in Brazil by the end of the year, and plans to expand the service to North America.
An American start-up called FarmWise builds an autonomous weeding robot for the vegetable industry that detects every plant in a field, both weed and crop. Onboard computers then send instructions to the robotic weeding arms.
FarmWise spent years developing in-house AI algorithms that are specifically made to detect crops and weeds.
“We rely on deep learning algorithms and a lot of data that we accumulated over the years to get to a very accurate decision-making process, in terms of what type of plant this is, where it’s located, and then a few other parameters that help us do a very good job at the service that we deliver,” said Sebastien Boyer, chief executive officer of FarmWise.
Carbon Robotics, a Seattle-based start-up, uses AI in its 9,000-pound autonomous robot, which relies on a 72-horsepower Cummins diesel engine to power its weed-blasting lasers.
Paul Mikesell, the company’s founder, said the machine was built to manage weeds in vegetable row crops.
“There are eight lasers across. They are arranged in a fashion that’s parallel to the furrows. So, if you imagine a row with our vehicle in it that’s driving forward, those lasers are arranged linearly pointing back to front. Then through some optics the targeting bounces that beam down at the bottom of the robot to target the weeds,” Mikesell said.
His background is in computer vision and deep learning, which he applied to help the robot differentiate weeds from crops.
“It’s a learning algorithm, so it’s a neural net that has many different layers to it. It runs on GPUs (graphics processing units), which originated for graphics processing and have now been used for other things … like cryptocurrency mining. We use them a lot in deep learning because it’s very fast vector operations, things that can run in parallel, much like a human brain does,” Mikesell said.
The learning procedure involves providing the algorithm with many labelled sample images that indicate what’s in the image, he added.
“By label, I mean pixel locations that have a label associated with them. So, this would be an onion, for example, and there’s an outline of an onion; or this is a weed that we want to shoot, and we’ll have the centre of the weed’s meristematic growth plate that we’re shooting with the laser,” Mikesell said.
Once the neural net is given enough samples, it will learn to differentiate weeds and crops.
“Now it can make inferences about things that it hasn’t seen before, and it can say, ‘oh that’s this kind of plant. I’ve seen that before. It’s not an exact copy but I know that’s an onion. Or I’ve seen that before, it’s not an exact copy, but I know that’s a purslane, which is a type of weed, or lamb’s quarters, which is a type of weed.’ So, it learns, and then as we feed it more and more information it gets better and better.”
The processing and predictions are made in real time by the on-board computer, which does not need broadband connectivity.
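One way to picture that last step: the network’s per-pixel predictions are grouped into individual weeds, and each weed’s centre becomes an aiming point for the lasers. The class map, class ids and library calls below are illustrative assumptions, not Carbon Robotics’ implementation.

```python
# Sketch of reducing per-pixel predictions to laser target points; the class
# map is fabricated here, where a real system would get it from the network.
import numpy as np
from scipy import ndimage

WEED = 2  # assumed class ids: 0 = soil, 1 = crop (e.g. onion), 2 = weed

class_map = np.zeros((100, 100), dtype=int)
class_map[40:43, 60:63] = WEED   # one small weed blob
class_map[70:74, 20:25] = WEED   # another weed blob

# Group weed pixels into individual plants, then take each blob's centroid
# as a stand-in for the meristem point the laser should hit.
blobs, n = ndimage.label(class_map == WEED)
targets = ndimage.center_of_mass(class_map == WEED, blobs, list(range(1, n + 1)))
print(targets)  # pixel coordinates to convert into laser aiming angles
```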
However, training the neural nets requires the team to gather example imagery and upload it to computers that run the deep learning process.
Before turning the laser-blasting robots loose on a field, Carbon Robotics scouts weeds to fine-tune the AI for a specific field.
“Sometimes we can deploy the exact same ones (neural nets) that we’ve had before. Sometimes there are some smaller tweaks, what’s generally referred to as fine-tuning,” Mikesell said.
“The procedure usually takes 24 to 48 hours from initial arrival (at the field) to getting a good neural net, good predictions for us. That’s assuming it’s a new field.”
Some companies involved in environmental monitoring offer AI-based products to help producers make decisions.
For example, a California company called FarmSense has developed an insect-monitoring system that uses AI to classify insects by the sounds they make.
The sounds insects make in flight vary between species, but microphones are difficult to use in field sensors because of environmental noise.
FarmSense developed a sensor that uses a curtain of light at the opening of the trap. When a bug flies through the light, it causes a specific disruption pattern.
“We have a kind of microphone but it actually records sound bits with light, not the actual sound. We call it pseudo acoustic,” said Eamonn Keogh of FarmSense.
The sensor tracks the movement of an insect’s wing beats with a laser and a phototransistor array, and then converts the disruption to a sound file, which FarmSense processes with algorithms capable of identifying the bug associated with the sound.
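A simplified way to picture that signal processing: treat the phototransistor output as audio and estimate the dominant wingbeat frequency with a Fourier transform. The synthetic 220 Hz signal and sampling rate below are assumptions for illustration, not FarmSense’s pipeline.

```python
# Estimate a wingbeat frequency from a pseudo-acoustic signal; the synthetic
# sine wave stands in for a real phototransistor recording.
import numpy as np

SAMPLE_RATE = 8000                        # assumed sensor sampling rate, Hz
t = np.arange(0, 0.5, 1 / SAMPLE_RATE)
signal = np.sin(2 * np.pi * 220 * t)      # wings occluding the light curtain
signal += 0.2 * np.random.randn(t.size)   # environmental noise

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / SAMPLE_RATE)
print(f"dominant wingbeat frequency: {freqs[np.argmax(spectrum)]:.0f} Hz")
# A classifier trained on an archive like FarmSense's would combine this
# with other spectral features to decide which species made the signal.
```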
Keogh said the algorithms can extract information beyond wing beats to differentiate specific insects, but they had to be trained to do so.
He said the company built a large archive of insect sound data for the machine-learning algorithms to work from.
“For the last several years we’ve been taking insect larvae, putting them in a cage with our sensor and letting them hatch. We watched them from birth to death, 24 hours a day, under different temperature and lighting conditions, air pressure, hungry versus not hungry and so forth,” Keogh said.
Beyond autonomous farming and environmental monitoring systems, tools based on AI that bring together data sets and then run compounding analytics may become essential for farmers.
Management decisions on Canadian farms will increasingly be made by software, said Greg Killian of EPAM, an international software engineering services company that works across many industries.
Killian said agriculture faces robust competition throughout the global food supply chain, but retail, finance, transportation and the life sciences are all further along the digital revolution path.
“They’ve had to adapt to similar pressures, pricing pressures, competitive pressures and things like that, which have forced them to become effectively software companies. If you look at Walmart or Target or the large banks, software has become a huge part of what they’re doing,” Killian said.
Many data sets, from demographic trends to global logistics, feed machine-learning algorithms that play a pivotal role in how these companies are managed.
Large grain-trading companies already work with powerful algorithms that crunch data sets from around the globe, from weather, logistics, and supply and demand metrics, to help them determine market positions.
Some companies offer programs built on historical production data. Growers can use them to model low and high rainfall scenarios, examine past crop price trends and, from there, determine optimal field-specific fertility rates and crop varieties.
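A toy sketch of that kind of analysis follows; the rainfall records, percentile choices and fertility lookup are invented for illustration and are not any vendor’s model.

```python
# Derive dry- and wet-year rainfall scenarios from historical records, then
# pick a nitrogen rate per scenario; all numbers here are illustrative.
import numpy as np

historical_rainfall_mm = np.array(
    [310, 280, 355, 240, 390, 300, 265, 330, 410, 295])  # past growing seasons

low = np.percentile(historical_rainfall_mm, 20)   # dry-year scenario
high = np.percentile(historical_rainfall_mm, 80)  # wet-year scenario

def nitrogen_rate(rainfall_mm):
    """Toy lookup: more expected moisture supports a higher target rate."""
    return 60 if rainfall_mm < 280 else 90 if rainfall_mm < 350 else 110

print(f"dry scenario: {low:.0f} mm -> {nitrogen_rate(low)} lb/ac N")
print(f"wet scenario: {high:.0f} mm -> {nitrogen_rate(high)} lb/ac N")
```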
On the research side, universities and government organizations around the world are examining how to use AI to make efficiency gains in agriculture.
In Canada, the Disease Risk Tool (DiRT1) built by Agriculture Canada in 2016 is being updated to include crops beyond canola, with the help of AI.
DiRT1 combines information from satellites and user inputs into a web application that can be used to investigate the accuracy of crop-disease forecast models.
The prototype integrates geospatial data from Environment Canada on temperature and precipitation, and the web application allows users to provide information on seeding density and disease history in the field or region.
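As a purely hypothetical illustration of how such inputs might be combined, a simple scoring function could look like the sketch below; the weights and thresholds are invented and are not Agriculture Canada’s model.

```python
# Hypothetical DiRT-style risk score; weights and thresholds are invented.
def disease_risk(avg_temp_c, precip_mm, seeding_density, disease_history):
    """Return a 0-1 risk score for a moisture-driven crop disease."""
    score = 0.0
    if 15 <= avg_temp_c <= 25:   # temperatures that favour infection
        score += 0.3
    if precip_mm > 50:           # moist canopy conditions
        score += 0.3
    if seeding_density > 80:     # dense stands stay wet longer
        score += 0.2
    if disease_history:          # inoculum already in the field or region
        score += 0.2
    return score

print(disease_risk(avg_temp_c=19, precip_mm=65,
                   seeding_density=90, disease_history=True))
```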
Information from the annual crop inventory and soil moisture data are combined with estimates of the crop’s growth stage, using a method developed by A.U.G. Signals.
A.U.G. Signals has worked for more than three decades in signal, image and data processing.
Zeeshan Ashraf, senior manager of strategy and business development at A.U.G., said the company has collaborated with the Canadian government since 2014 to use remote-sensing technology to create a tool for crop phenology estimation.
A.U.G. also developed an early-stage crop classification tool to help government agencies get an early glimpse of what farmers are growing in regions throughout the country.
“A.U.G.’s technologies are based on both optical data from satellite sensors and radar-based data,” Ashraf said.
Clouds and fog can prevent satellite cameras from collecting data at crucial times during the growing season but those conditions do not block radar signals.
The RADARSAT Constellation Mission is a three-spacecraft fleet of Earth observation satellites operated by the Canadian Space Agency that began operating in 2019.
“This is a 100 per cent Canadian technology. The farmers and the end users have day and night coverage of their crops on their fields for any purposes that they may need it for,” Ashraf said.
The new tool A.U.G. is building with Ag Canada, DiRT2, uses the company’s artificial intelligence and data analytics technology.
“Data from all these different sensors are aggregated, and then the online tool produces predictions and models for the end-user, who would be notified about their crop condition and whether it is the right time to apply, for example, pesticides,” Ashraf said.
The EPAM Cloud Pipeline enables scientists and companies to build and run customized scripts and workflows to support modelling and simulation, as well as machine learning.
The company has been involved with big data management for 14 years and has large platforms under its management, from drug research to video games, and it processes more than 100 petabytes of data and more than a billion messages and events every day.
Killian said the rate of change in agriculture stemming from high-level software strategies will be much faster than it was for the industries already further along the digital revolution path.
“There is more innovation to draw upon,” Killian said.
“Now, almost ubiquitously, genetic science, whether it’s for life sciences or for agriculture, is being done with high-performance computing, which uses graphical processors that came out of gaming,” Killian said.
For instance, a typical laptop processor today has around eight cores, but graphics processing units (GPUs) now contain thousands of cores and provide processing power that would have been unthinkable a short time ago.
Computer vision works on digitized visual data, and Killian said it doesn’t matter whether that data comes from a video game’s rendered environment or from a camera on a sprayer.
This article was originally published at The Western Producer.