Robots that can see and work alongside humans could hold the key to some of Australia’s problems, such as an ageing population and a chronic shortage of agricultural workers, by helping out in aged care homes and taking over from backpackers picking fruit and veggies.
A new $25.6 million Australian Centre of Excellence for Robotic Vision has just been launched at Queensland University of Technology (QUT) in partnership with ANU, Adelaide and Monash Universities.
The research team will spend the next seven years developing robots with the kind of advanced vision and hand-eye co-ordination necessary for them to work safely with humans and to complete more complex tasks.
Government News spoke to the Centre’s Director, Professor Peter Corke, about how close we are to having robots picking apples in orchards or turning people in their hospital beds.
Prof Corke said that until now, robots have been confined to working in factory cages, in “highly organised environments”.
“The robots that you see working in car factories are very fast, very strong, but actually quite dangerous to human beings so in those areas of the factory human beings are excluded from being in there with the robots,” Prof Corke said.
“They are essentially blind machines. That means although they can work very well in an organised place like a factory, they can’t work at all well in an environment that is unorganised, so that might be in an orchard or vegetable field where they’re trying to harvest produce.”
One of the Centre’s robots, Baxter (made by Rethink Robotics), is ‘visually enabled’ with cameras in his head and wrists, which allows him to recognise objects in the workplace as well as to move and manipulate them.
This capacity also makes him human-friendly because if Baxter does happen to touch a human colleague he can sense this and stop.
Prof Corke calls Baxter “the vanguard of the next generation of robots”, which he says we will increasingly see in our workplaces.
“It’s a good research tool because it’s the first robot with some visual capability. One aspect of our research is how to make that vision capability better.”
Improving robots’ visual capability is an area scientists have been working on for more than 50 years, and one closely linked to neuroscience.
For example, when humans see a building, whether in daylight or at night, during a storm or with ten people standing in front of it, we know it’s the same building. This is a much more difficult concept for robots to grasp, says Prof Corke.
“We are extraordinarily good at filtering out all of this. We have got a highly evolved vision system and about one-third of our brain is devoted to visual processing. We don’t yet have the capability to emulate the human brain – or even one-third of it – but neuroscience has come along in leaps and bounds.
“We’re starting to learn more tricks about how the human brain visually processes things, we just need to pinch some of these tricks and embed them into some of our robots.”
Prof Corke said search engines like Google and social media sites like Flickr had opened up new possibilities for creating artificial neural networks because of the huge amounts of data they held.
For example, a human can identify coffee cups of different shapes and colours and still know each one is a coffee cup. This hasn’t been so easy for a robot to do, but Prof Corke said that if you show a robot enough images it will begin to learn the characteristics of objects.
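To make that “show it enough images” idea concrete, here is a minimal sketch using the open-source PyTorch library. The folder layout, image size and network shape are illustrative assumptions, not details of the Centre’s own systems.

```python
# Minimal sketch: teaching a small neural network to recognise objects
# (e.g. coffee cups) from many labelled example images.
# Assumes an images/ folder with one sub-folder per object class.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Every image is resized and turned into a tensor of pixel values.
preprocess = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])

# Folder names ("cup", "not_cup", ...) become the class labels.
train_data = datasets.ImageFolder("images/", transform=preprocess)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# A deliberately small convolutional network: it learns visual
# characteristics (edges, curves, handle shapes) from the examples.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(train_data.classes)),
)

optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Each pass over the images nudges the network towards fewer mistakes;
# with enough varied examples it generalises to cups it has never seen.
for epoch in range(10):
    for images, labels in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()
```

The network is never told what a cup looks like; the characteristics emerge from the examples, which is why the vast image collections Prof Corke mentions matter so much.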
Developing this visual capability is critical if robots are going to take over some of the routine heavy-lifting jobs in the Australian economy, in areas like cleaning, farming and factory work where Australia suffers a labour shortage.
Prof Corke’s team is also working on another robot, Agbot II, which is built for agricultural tasks, including fertilising land and spraying herbicides or pesticides.
“The idea is that they operate in big fields, like wheat fields, and use their vision to navigate along the rows of the fields and if there’s some obstacle, like a fallen tree or a fence post, then they can detect that and adapt their behaviour. They can also give feedback to the farmer,” he said.
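In engineering terms, what Prof Corke describes is a sense-act loop: estimate where the row is, steer to stay on it, and stop if something blocks the path. Here is a toy sketch in Python, with the perception step stubbed out and every value invented for illustration.

```python
# Toy sketch of vision-guided row following: steer to keep the crop row
# centred in the camera image, and stop if an obstacle appears.
from dataclasses import dataclass

@dataclass
class Perception:
    row_offset_m: float   # how far the detected row sits from image centre
    obstacle_ahead: bool  # e.g. a fallen tree or fence post in the path

def perceive(frame) -> Perception:
    # Placeholder for the real vision pipeline (row detection plus
    # obstacle detection); here we pretend the row has drifted 0.1 m.
    return Perception(row_offset_m=0.1, obstacle_ahead=False)

def control_step(p: Perception, gain: float = 0.8) -> dict:
    if p.obstacle_ahead:
        # Adapt behaviour: stop, and flag the obstacle for the farmer.
        return {"speed_mps": 0.0, "steer": 0.0, "alert": "obstacle in row"}
    # Proportional steering nudges the robot back towards the row centre.
    return {"speed_mps": 1.0, "steer": -gain * p.row_offset_m, "alert": None}

command = control_step(perceive(frame=None))
print(command)  # steers gently back towards the row centre
```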
Agbot II can already pick plastic capsicums from plastic plants, but backpackers shouldn’t despair just yet. Agbot needs refining, particularly in its vision and hand-eye co-ordination, and Prof Corke says trials are “a few years out”.
“We’re doing quite well at finding the capsicums but we haven’t quite worked out the best fingers to add to the robot. You can’t grab it too hard or you’ll damage the fruit but you need to grab it reasonably firmly so you can apply enough force to get it off the bush,” Prof Corke said.
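The trade-off in that quote can be written down as a simple constraint: the grip force must sit above the force needed to pull the fruit off, but below the force that bruises it. A back-of-envelope sketch, with both thresholds invented for illustration:

```python
# Sketch of the gripping trade-off Prof Corke describes: squeeze hard
# enough to detach the capsicum, but not so hard that it bruises.
# Both force thresholds below are assumptions, not measured values.

BRUISE_FORCE_N = 15.0   # assumed force at which the fruit is damaged
DETACH_FORCE_N = 8.0    # assumed pull force needed to free it from the bush

def choose_grip_force(detach_n: float, bruise_n: float,
                      margin: float = 1.2) -> float:
    """Pick a grip force with a safety margin above the detachment
    force, and fail loudly if no safe window exists."""
    target = margin * detach_n
    if target >= bruise_n:
        raise ValueError("no safe grip window: would damage the fruit")
    return target

print(choose_grip_force(DETACH_FORCE_N, BRUISE_FORCE_N))  # 9.6 N
```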
The team is also working on an underwater robot to detect and kill the crown-of-thorns starfish, a notorious destroyer of coral. At the moment, divers must inject each starfish, an incredibly expensive and time-consuming process.
A robot that can identify and inject the starfish has already been developed; the team is now working on the underwater vehicle that will carry it.
Prof Corke says robots are well adapted to coping with highly labour-intensive situations and suggests they could be used to offset the problems of an ageing population.
Robots could be used to help lift people, turn them in their beds or take them to the toilet.
Asked whether robots could ever replace human interaction in hospitals, aged care homes or community care settings, Prof Corke said it was possible, but not necessarily desirable.
“Humans are able to emotionally connect to almost anything. It’s extraordinary. People can emotionally connect with machines. Whether it’s ethical to rely on them, I don’t know, but it’s certainly possible.”
The aim is to build a much more generic, multi-talented robot. At the moment each robot has a specific task, whether it’s picking capsicums, injecting starfish or assembling a factory product. The ideal is one robot that can do the lot, with software that can be quickly downloaded to train other robots.