Drones are not simply taking over the skies. They're taking to the seas, too. The oceanic explorers of the future could be teams of ocean-going robots, manned or unmanned, working together to gather more data than marine scientists have ever had to work with before. The Coordinated Robotics expedition is putting that vision to the test.
On board the Schmidt Ocean Institute's Research Vessel Falkor (which research teams get to use free of charge), the team, led by chief scientist Oscar Pizarro of the Australian Centre for Field Robotics (ACFR) at the University of Sydney, can run experiments that examine the way multiple mechanical tools can work together. By improving these tools and the way scientists use them, the team aims to advance techniques that will lead to more and better data about our precious oceans.
A wide range of robotic instruments can be the eyes, ears, and hands of scientists. In the case of this expedition, that includes autonomous underwater vehicles (AUVs), gliders, Lagrangian floats, and autonomous surface vessels (ASVs).
The AUVs take photos and collect water measurements at various depths, while the floats measure ocean currents and other parameters such as temperature or salinity. ASVs and gliders, meanwhile, collect data as they rise and fall with the waves.
For two weeks, the crew worked at Scott Reef, a remote area in the Timor Sea between Australia and Indonesia. There, far away from any obstacles and other ships, the science team performed 19 dives with the AUV Sirius, totaling 200 hours of bottom time. The team also completed 22 dives with the Lagrangian float, collecting 3,000 to 4,000 images on each dive. "We explored some new areas," said Stefan Williams of ACFR, pointing to the advantage of having multiple robots operating simultaneously and being able to re-task them on the fly.
During these two weeks, the science team collected approximately 400,000 images, about a terabyte of data a day. The imagery came from sites previously visited in 2009 and 2011, so the new pictures will provide valuable long-term monitoring of those areas. Using the new high-resolution maps in conjunction with the earlier data, the science team identified the sites of most interest and targeted them for surveys with the AUVs on board.
One big challenge of this research is having so many different sea vehicles in close proximity to each other. Getting all these platforms to communicate is not easy, and maintaining a good sense of where the AUVs are and what they are doing, while staying in communication with them, is cutting-edge science. To solve the problem, post-doctoral research engineer Ariell Freedman developed a web-based tool that plots the vehicles' locations in real time. The flexible tool allowed scientists to see what was happening from anywhere on the ship using a computer, TV, or even a smartphone.
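The article does not describe how Freedman's tracking tool works internally. As a rough sketch of the core idea, the code below keeps the latest known position for each vehicle and serializes the fleet state for a web page to poll; the message format (vehicle id, latitude, longitude, depth, timestamp) and all vehicle names are assumptions for illustration.

```python
import json


class VehicleTracker:
    """Keep the most recent position report for each vehicle.

    A minimal sketch only: the actual tool built for the Falkor
    cruise is not described in the article, so the report fields
    used here are assumptions.
    """

    def __init__(self):
        self._latest = {}  # vehicle id -> most recent report

    def ingest(self, report):
        """Accept a position report, keeping only the newest per vehicle."""
        vid = report["id"]
        prev = self._latest.get(vid)
        if prev is None or report["t"] >= prev["t"]:
            self._latest[vid] = report

    def snapshot(self):
        """Return all latest fixes as JSON that a browser could poll."""
        return json.dumps(sorted(self._latest.values(), key=lambda r: r["id"]))


# Hypothetical reports arriving from acoustic/radio links on deck.
tracker = VehicleTracker()
tracker.ingest({"id": "sirius", "lat": -14.05, "lon": 121.77, "depth": 25.0, "t": 100})
tracker.ingest({"id": "float-1", "lat": -14.06, "lon": 121.78, "depth": 5.0, "t": 101})
tracker.ingest({"id": "sirius", "lat": -14.06, "lon": 121.79, "depth": 30.0, "t": 102})
print(tracker.snapshot())  # one entry per vehicle, newest fix wins
```

In a real deployment the snapshot would sit behind a small web server, which is how a browser, TV, or smartphone on the ship's network could all display the same live picture.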
Of course, the problem with amassing a trove of images is figuring out how to process them. Pixels on hard drives don't really paint the picture of what is going on. To draw scientific conclusions from the data, the team needs to know what the pictures show, but there are simply too many images for the scientists to review themselves. The solution is to task computers with organizing the imagery, but it's not that easy. Determining the content of a photo is one of those jobs that's easy for a human but still hard for a machine.
So Freedman asked robot and science enthusiasts for help. He created Squidle, a citizen science website built around images collected by the AUVs, which allows participants to label what they see. Once a large number of labels has accumulated, the team will use them to train machine learning algorithms to interpret the imagery automatically. It could open up some really interesting avenues for future research.
As an added bonus, the science team and crew on Falkor also managed to come back with the same number of robots they left with... which is never guaranteed.
Author: Logan Mock-Bunting