Deploying autonomous technology on water comes with extra challenges

RangerBot investigating a reef.

For a robotics engineer who specialises in AI’s ability to perceive its environment, autonomous marine vehicles offer a unique challenge.

Matthew Dunbabin likes a challenge. That’s why he works with autonomous boats: when your research involves using robotics to perceive and navigate spaces, working on water accentuates the difficulties.

“It is challenging; when you don’t have communication, you also often don’t have GPS,” he told create.

“It makes you think about what’s important in the algorithms, the way you perceive the environment.”

The open waters are a long way from where his career started. Dunbabin, a Professor at Queensland University of Technology’s School of Electrical Engineering and Robotics, was originally interested in designing Formula One racing cars.

That took him into simulation modelling of road trains and, after that, work on mining robotics at the CSIRO. 

“After a little while I became fascinated by the potential to use robotic technology for the conservation and management of natural environments,” he said.

“I basically then refocused my entire research into what we call environmental robotics — that is, the creation of robotic systems to accurately perceive, move around, and perform complex tasks in the environment.”

Putting the autonomous vessel to work.

That focus — particularly as applied to the maritime domain — has come to life in two innovations that use autonomous watercraft to reach places and perform activities that are difficult or dangerous for humans to achieve alone. 

These include SAMMI — or Seqwater’s Automated Motorised Monitoring Instrument — a surface vessel designed to monitor water quality in hard-to-reach reservoirs, and RangerBot, a vision-based underwater robotic system that helps conserve coral reef environments and is currently used to help preserve and restore the Great Barrier Reef.

One way it does this is by identifying and counting crown-of-thorns starfish, an insidious pest that feasts on coral, matures rapidly and reproduces quickly.

“We have developed a system that initially started with a robot called COTSbot to automatically identify the starfish,” Dunbabin explained. “The algorithms we have developed operate on board the system — so there’s no human in the loop — processing images, looking for starfish, while the robot controls itself in real time.”
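To make that onboard, no-human-in-the-loop idea concrete, here is a minimal Python sketch of a perceive-decide-act loop of the kind described. The camera, detector and control functions are hypothetical placeholders, not RangerBot’s actual software.

```python
# A minimal sketch (not Dunbabin's code) of an onboard detect-and-act loop:
# grab a frame, run a detector, and feed the result straight into the control
# step with no human in the loop. All interfaces below are hypothetical.
import random
import time
from dataclasses import dataclass


@dataclass
class Detection:
    label: str        # e.g. "crown_of_thorns"
    confidence: float
    x: float          # normalised image coordinates of the detection centre
    y: float


def capture_frame():
    """Placeholder for reading a frame from the robot's camera."""
    return object()


def detect_starfish(frame):
    """Placeholder for the onboard vision model; returns a list of detections."""
    if random.random() < 0.1:
        return [Detection("crown_of_thorns", 0.97, random.random(), random.random())]
    return []


def steer_towards(detection):
    """Placeholder for the control step that re-plans towards a target."""
    print(f"steering towards target at ({detection.x:.2f}, {detection.y:.2f})")


def continue_survey():
    """Placeholder for resuming the pre-programmed survey pattern."""
    print("continuing survey leg")


CONFIDENCE_THRESHOLD = 0.9  # only act on high-confidence detections


def mission_loop(duration_s: float = 5.0):
    """Run the perceive-decide-act cycle entirely onboard."""
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        frame = capture_frame()
        targets = [d for d in detect_starfish(frame)
                   if d.label == "crown_of_thorns" and d.confidence >= CONFIDENCE_THRESHOLD]
        if targets:
            steer_towards(targets[0])
        else:
            continue_survey()
        time.sleep(0.1)  # roughly 10 Hz control rate for this sketch


if __name__ == "__main__":
    mission_loop()
```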

Serving and protecting

RangerBot isn’t just designed for surveillance — though it is effective at that: it can pick out a crown-of-thorns starfish from other sea life with 99.4 per cent accuracy.

And once it has done so, it can address the problem.

“It is challenging when you don’t have communication, you don’t have GPS. It makes you think about what’s important in the algorithms, the way you perceive the environment.”
Matthew Dunbabin

“If it sees an image and if we have the injection system on it as a payload, then yes, the robot can inject the starfish,” Dunbabin said.

And this injection only affects the coral-destroying starfish; it is harmless to other life on the reef.

“We’re now looking at how to optimise the deployment of these robotic systems in what we call techno-economic assessments,” Dunbabin said. “Is it more efficient to use robots, humans, or a combination of both?”

RangerBot can also be used to monitor other issues affecting the reef, such as coral bleaching, water quality, pollution and siltation.

One thing that was important for Dunbabin to get right was the user experience (UX).

RangerBot has pre-programmed missions that can be selected using a tablet, and Dunbabin wanted this process to be as straightforward and as intuitive as possible.

“The goal of RangerBot in terms of UX was to have any person — whether a person off the street, someone who has some snorkelling experience, or a management authority — be able to operate the vehicle within five minutes,” he said.

“They can quite easily generate a mission, upload it and then send it off. This was actually the first time that we really started to explore the integration of smart-device psychology, creative industries, user experience [and] interactive designers in our robotic systems.”
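As an illustration only, a pre-programmed mission of the kind a tablet app could generate and upload might reduce to a simple declarative structure like the sketch below. The field names, waypoint format and validation checks are assumptions, not RangerBot’s actual mission schema.

```python
# Hypothetical mission definition of the kind a tablet app might generate and
# upload to the vehicle; field names and structure are illustrative only.
survey_mission = {
    "name": "north_reef_cots_survey",
    "max_depth_m": 8.0,          # stay above this depth
    "speed_mps": 0.5,            # cruise speed
    "payload": "camera_only",    # e.g. "camera_only" or "injection_system"
    "waypoints": [               # (latitude, longitude) pairs along the survey leg
        (-18.2871, 147.6992),
        (-18.2875, 147.7001),
        (-18.2880, 147.7010),
    ],
}


def validate_mission(mission: dict) -> bool:
    """Basic sanity checks a tablet app could run before upload."""
    return (
        len(mission["waypoints"]) >= 2
        and mission["max_depth_m"] > 0
        and mission["speed_mps"] > 0
    )


print(validate_mission(survey_mission))  # True
```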

And that meant testing out the UX on everyone from members of marine research groups to members of the general public.

“The very first apps were based on my experience, and nobody got them. If you gave them to the general public, there were terms in there that weren’t familiar,” Dunbabin said.

“We went through a few iterations and converged on something that’s actually quite nice, and it’s fit for purpose.”

Coral creation

But RangerBot is more than a starfish-menacing machine.

“I’m working with Professor Peter Harrison from Southern Cross University,” Dunbabin said.

“He’s a Coral IVF expert, so he knows how to take coral larvae, rear it, and then redistribute it over damaged reefs to regrow reefs. What we’re now doing is actually using RangerBots, which we call LarvalBots for that purpose, to take that coral larvae and basically precision-seed the reef.”

“The goal of RangerBot in terms of UX was to have any person be able to operate the vehicle within five minutes.”
Matthew Dunbabin

The robot’s real-time vision system determines if a reef substrate is suitable for reseeding and, if it is, it delivers the coral larvae to that section.
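Illustratively, that decision can be thought of as a vision check followed by a delivery action. In this hedged sketch, the substrate classifier and release mechanism are hypothetical stand-ins, not LarvalBot’s actual software.

```python
# Illustrative only: a tiny sketch of the "check substrate, then seed" decision
# described above. The classifier and release mechanism are placeholders.
import random


def substrate_suitable(frame) -> bool:
    """Placeholder for the real-time vision check of the reef substrate."""
    return random.random() < 0.3


def release_larvae(volume_ml: float = 50.0):
    """Placeholder for triggering the larval delivery payload."""
    print(f"releasing {volume_ml} ml of coral larvae")


def seed_if_suitable(frame):
    """Seed only where the substrate looks right; otherwise keep surveying."""
    if substrate_suitable(frame):
        release_larvae()
    else:
        print("substrate unsuitable, moving on")


seed_if_suitable(frame=object())
```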

“We’ve been able to upscale the deployment areas dramatically,” Dunbabin said. “In the last year we’ve mainly gone from diver-based systems of patches that are maybe 200 m² at most, to having one robot cover three hectares in six hours.”

Sailing with SAMMI

The importance of UX is something RangerBot shares with SAMMI, the water monitoring boat that Dunbabin devised for the Queensland Government water utility Seqwater.

“The tablet is the interface to set the mission that you want it to do,” Dunbabin said. 

“Dam operators and scientists have preloaded missions that routinely go out and do things, but we can also set up new missions. This tablet UX has been specifically designed to allow operation very easily — so you don’t need to be a trained roboticist — or really care about the technology — to do useful tasks.”

Dunbabin operating SAMMI.

SAMMI is currently deployed on a remote reservoir in south-east Queensland, where it helps Seqwater monitor the quality of drinking water. Usually, this is a task humans are required to carry out, and it is one that can be difficult and dangerous. The reservoir is hard to access, particularly in times of drought or after a storm.

“What we looked at was how can you remotely conduct the normal methods of compliance monitoring that Seqwater needs to routinely do to manage the reservoir,” Dunbabin said. 

“We then brainstormed up some ideas. This was around SAMMI — the idea of a robotic boat that could live on the water storage and routinely collect water samples at specific times every couple of weeks.”

But, in turning this idea into reality, the project got more complicated than first anticipated.

“We already had a group of autonomous boats that we’d worked on in the past and were interested in using, but they turned out to be too small and their capabilities weren’t good enough to do these very remote unsupervised management tasks,” he explained.

“We had to basically go back to the drawing board and come up with a whole new concept. As we were designing these concepts and specifications, we had to account for things like being helicopter-lifted into the reservoir, if need be, and being protected from debris during major storm events.

“All these other things that aren’t normally part of a robotics program had to be considered. How do you get access to deploy it? Once it’s in there, how do you stop biofouling?”

“On a typical mission, the robot travels over seven kilometres.”

Once he started addressing these problems, Dunbabin found that SAMMI was able to collect more data than Seqwater had originally hoped.

For instance, it could engage in continuous water quality monitoring — such as measuring temperature, acidity, and salinity — that was previously done using fixed instruments at remote sites. It was also able to take physical samples of the water and bring them back to waiting dam operators and scientists. 
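As a rough illustration of that kind of continuous logging, the sketch below records timestamped temperature, pH and salinity readings to a CSV file. The probe-reading functions and values are hypothetical placeholders, not Seqwater’s or QUT’s software.

```python
# A minimal sketch of continuous water-quality logging of the kind described
# above; the sensor-reading functions and their values are placeholders.
import csv
import random
import time
from datetime import datetime, timezone


def read_temperature_c():
    """Placeholder for the onboard temperature probe."""
    return 21.0 + random.uniform(-0.5, 0.5)


def read_ph():
    """Placeholder for the onboard pH (acidity) probe."""
    return 7.6 + random.uniform(-0.1, 0.1)


def read_salinity_psu():
    """Placeholder for the onboard salinity probe."""
    return 0.2 + random.uniform(-0.05, 0.05)


def log_transect(path="water_quality.csv", samples=5, interval_s=1.0):
    """Record a timestamped row of readings at a fixed interval."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["utc_time", "temperature_c", "ph", "salinity_psu"])
        for _ in range(samples):
            writer.writerow([
                datetime.now(timezone.utc).isoformat(),
                round(read_temperature_c(), 2),
                round(read_ph(), 2),
                round(read_salinity_psu(), 3),
            ])
            time.sleep(interval_s)


if __name__ == "__main__":
    log_transect()
```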

“The whole idea of SAMMI is to travel across this water storage where there is only one access point in and out,” he said.

“On a typical mission, the robot travels over seven kilometres.”

SAMMI has been at work for eight months — continuously and autonomously. That means it has to be entirely solar-powered, since it can’t be refuelled regularly.

“We can actually manage our power with a reasonable-sized solar panel and onboard batteries,” Dunbabin said. 

“We can’t go out every day and travel the whole storage, but we can do the compliance monitoring, which is required every couple of weeks.”

It’s exactly those kinds of challenges that keep Dunbabin engaged with this field. 

“You’ve got systems that need to be out there for hours to days, to months, to years,” he said.

“How do you manage power? What are the minimum sensor and hardware and algorithm requirements needed to operate day and night? These are some of the big challenges that make it a really interesting field to work in.” 
