Rangeland graziers are set to be the next beneficiaries of new research into weed-seeking robots.
This week’s $10 million funding package for pest animals and weeds control from the federal government has backed a project to extend existing cropping-targeted technology to grazing country.
Agricultural robotics is already a focus for major Australian research institutions such as the Queensland University of Technology, with its AgBot, and the University of Sydney's Australian Centre for Field Robotics, as well as private agtech developer SwarmFarm.
But until now, efforts have centred on relatively controlled cropping applications, with uniform, flat terrain largely free of trees and rocks, predictable light conditions, limited weed varieties and neat boundaries.
Rangelands are more challenging, with a wide variety of weeds, rugged terrain and highly variable natural light.
Leading research from a team at James Cook University (JCU) is developing new imaging technology which could save time and money for graziers battling debilitating weeds such as prickly acacia, lantana and the host of other pastoral nasties which proliferate across the country.
Driving the project is Alex Olsen, a lecturer with JCU's College of Science and Engineering who is undertaking a PhD in image processing for precision agriculture.
“Rangelands have a virtually infinite number of species, changes in the light and tough terrain for a robot and imaging technology to deal with, which makes it a very specific area of research.”
Weeds comprise around 15 per cent of flora across the country and slug the ag sector about $4 billion a year in management costs and lost production.
“There are so many rangeland grazing areas right across Australia where it’s critically important to reduce the cost of weeds,” Mr Olsen said.
“This could be a very useful application.”
The basic elements for cropping and grazing applications are the same.
A camera collects images that are processed by bespoke software to identify weeds; once a weed is detected, a range of robotic control methods can be deployed, from mechanical removal to sprays and even ballistics, where a paintball-style gun fires a control substance at a hard-to-reach target.
But image processing technology for rangelands needs a novel solution. That's where Mr Olsen's research comes in.
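The detect-then-treat loop can be sketched in a few lines of code. This is a hypothetical illustration only; the class, thresholds and species names below are assumptions for the example, not details of the JCU system.

```python
from dataclasses import dataclass

# Hypothetical detection record produced by the weed-identification software.
@dataclass
class Detection:
    species: str      # species label assigned by the classifier
    confidence: float # classifier confidence, 0.0 to 1.0
    x: float          # position of the weed in the camera frame
    y: float

def choose_treatment(det: Detection) -> str:
    """Pick a control method for a detected weed (illustrative rules only)."""
    if det.confidence < 0.8:
        return "skip"        # too uncertain to spend herbicide on
    if det.species == "prickly_acacia":
        return "ballistic"   # paintball-style projectile for a hard-to-reach target
    return "spray"           # default: targeted herbicide spray

print(choose_treatment(Detection("prickly_acacia", 0.95, 1.2, 0.4)))  # ballistic
```

The confidence gate matters economically: acting only on high-confidence detections is what lets a robot save herbicide compared with blanket spraying.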
JCU's method combines images from the visible and near-infrared spectrum to collect more detail and cope with the complex environment.
“Normally you apply one system or the other, but we are marrying the traditional approach of the visible spectrum, which captures texture, colour and shape, and combining it with the near-visible infrared spectrum,” Mr Olsen said.
“Marrying the systems allows us to differentiate between a greater number of species in a complex environment.”
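In practice, one common way to marry the two sources is to stack the near-infrared band alongside the visible colour channels, so a classifier sees four channels instead of three. The snippet below is a toy sketch of that fusion step using random arrays; real systems would also need the two sensors spatially aligned, and this is an assumed approach, not the project's published method.

```python
import numpy as np

# Toy stand-ins for co-registered sensor frames (random data for illustration).
rgb = np.random.rand(64, 64, 3)  # visible image: texture, colour, shape
nir = np.random.rand(64, 64, 1)  # near-infrared reflectance band

# Fuse by stacking channels: the classifier now receives a 4-channel input,
# gaining the extra spectral detail on top of the visible-spectrum cues.
fused = np.concatenate([rgb, nir], axis=-1)
print(fused.shape)  # (64, 64, 4)
```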
With the extra information from the new image processing technique, a convolutional neural network can be applied to what the robot’s camera captures.
This is a machine-learning system that allows the robot to process complex information efficiently and improve its accuracy over time, learning to better identify its target.
Project supervisor Professor Peter Ridd, also of JCU, said commercially viable robots still needed to be developed to cope with harsh rangeland conditions.
“Robotics is really at quite an early stage in agriculture. Industrial robotic technology is more than 30 years old,” Prof. Ridd said.
“But they’re a reality now, and will be common in the coming years.
“None of the bits required to build it are impossible and I’m confident it will be solved. We should see a prototype in the field within the decade.”