The oceans cover 70 percent of our planet, but humans have explored less than five percent of them, according to the National Oceanic and Atmospheric Administration. In the past, scientists observed the oceans using boats, divers and submarines. But new advances in robotic drone technology are enhancing our ability to study the seas. Last summer, the University of Delaware finished building its ocean robotics laboratory, which now hosts 13 robots.
Some of those robots weigh hundreds of pounds and are shaped like torpedoes. Dropped into the water, they can survey anything in the depths, from the plankton and squid that whales eat to planes and ships sunk by storms and wars. Others are flying drones that give researchers an aerial view of how sea level rise and storms are reshaping our coastlines.
Hunter Brown, the operations manager of UD’s robotic discovery laboratories, said that their unique collection of drones sets them apart from other oceanography laboratories. “We have 10 underwater robots and 3 aerial robots right now, making us one of the larger universities in terms of number of vehicles and on top when it comes to diversity,” said Brown. “So for example, we have the Remus 600, then we have some smaller versions called the Remus 100s, we’ve got two Teledyne Gavia vehicles, two Slocum Gliders, and we have three quadcopter aerial vehicles.”
Mark Moline, the director of marine science and policy at UD, said these robots are exploring the oceans in a way humans never have before, going deeper than any diver can and using sonar to identify the objects they encounter.
One robot, the Remus 600, was made by Kongsberg, a Norwegian manufacturer. It is the same shade of yellow as a school bus and measures a little more than three meters long, with two small black fins and a propeller at the end. When placed in the sea, it can travel as deep as 600 meters below its starting point. Moline points out a sensor on the drone called a transducer and explains how the robot uses it to detect exactly what marine life it has encountered.
“The transducer sends a signal out at a certain frequency down into the water and looks at the return,” said Moline. “By having two frequencies, there’s a larger one and a smaller one, this is a lower frequency and a higher frequency. By having two frequencies and looking at the ratio of the frequencies, you can tell what targets it’s bouncing off of, whether it’s a fish, whether it’s small zooplankton.”
Because each type of organism reflects the two frequencies in its own characteristic ratio, the scientists can identify what the drone is looking at. This feature was important for the Remus 600’s most recent mission: studying the distribution of food sources for whales. Whales are incredibly difficult to study because they dive to depths of around a thousand meters, or a little over half a mile, which is beyond what humans can easily observe. The Remus 600, however, can reach those depths because its 600-meter range is measured from its starting point: launch it 600 meters down and it can work at depths of up to 1,200 meters. “By putting the [drones] at 600 meters, you can extend the depths down to 1200 meters,” said Moline. “So now we’re able to find out, for the first time, what the distribution of food resources for deep diving whales are at those depths.”
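The dual-frequency idea Moline describes can be sketched in a few lines of code. This is a minimal illustration only: the threshold values and target labels below are assumptions for demonstration, not the lab’s actual classifier, which would rely on calibrated acoustic scattering models.

```python
def classify_target(return_low_db, return_high_db):
    """Classify a sonar target by comparing echo strength at a low
    and a high frequency (in decibels, so a difference here is a
    ratio in linear terms). Thresholds are illustrative only."""
    ratio_db = return_high_db - return_low_db
    if ratio_db > 10.0:
        # Tiny zooplankton scatter high frequencies far more strongly.
        return "zooplankton"
    elif ratio_db > 2.0:
        return "small fish"
    else:
        # Larger fish reflect both frequencies comparably well.
        return "fish"

# A target whose high-frequency echo is 15 dB stronger than its
# low-frequency echo reads as zooplankton under these thresholds.
print(classify_target(-80.0, -65.0))  # zooplankton
```

The key point is that neither echo alone identifies the target; it is the ratio of the two returns that separates zooplankton from fish.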
The Remus 600 isn’t the only drone that’s looking at aquatic life. One drone, the Slocum Glider, is currently swimming with penguins in Antarctica. Another, the Teledyne Gavia, surveys scallops to track how dredging affects their populations. Brown points to a black bar on its hull that serves as a sensor. “It creates an image using only sound,” said Brown. “By sending out sound pulses and timing the travel time and intensity of the reflection, it can build a reflection of what’s on the sea floor.”
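The timing principle Brown describes is straightforward: sound travels through seawater at a roughly known speed, so the delay before an echo returns tells the sonar how far away the reflecting object is. A minimal sketch, using a typical sound speed (real systems correct for temperature, salinity and depth):

```python
SOUND_SPEED_SEAWATER = 1500.0  # meters per second, approximate

def echo_range(travel_time_s):
    """Convert round-trip echo travel time to one-way distance.
    Divide by two because the pulse travels out and back."""
    return SOUND_SPEED_SEAWATER * travel_time_s / 2.0

# An echo that returns after 0.2 seconds came from about 150 meters away.
print(echo_range(0.2))  # 150.0
```

Repeating this for thousands of pings as the vehicle moves, and recording the intensity of each return, is what builds up a sound-only picture of the sea floor.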
The Teledyne Gavia also maps the sea floor for non-living things. UD Professor Art Trembanis used it to study a shipwreck discovered four years ago off the coast of Cape Henlopen. The ship was identified as the W.R. Grace, a supply ship that was run aground by a hurricane in 1889. Brown points to a large image of the wreck they have hanging on the wall. “The picture you see on the back right is of the W.R. Grace just offshore here,” said Brown. “And that was created just by using sound.”
The researchers have also been using the drones to search for lost war vessels to help the BentProp Project recover U.S. servicemen missing in action. Moline and other oceanographers travel every spring to Palau, a Pacific island nation near the Philippines that served as a critical World War II battle site. Last spring, they uncovered two lost planes and, potentially, the whereabouts of 70 missing airmen.
Just as the sensors on the Remus 600 can differentiate squid from plankton, the sensors on the Remus 100 can detect the difference between aircraft metal and organic matter. “One of the interesting things about the sound is that it reacts differently to different types of substrates, so if it hits mud it absorbs a lot of sound,” said Moline. “If it hits metal it returns a lot of sound, and so if we fly by an aircraft, it’s going to give a very bright return signal and we’re going to identify that.”
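In code, picking out those bright metallic returns amounts to flagging echoes that stand well above the muddy-seabed background. A minimal sketch, with illustrative intensity values and thresholds that are assumptions, not the lab’s actual processing:

```python
def flag_bright_returns(intensities_db, background_db=-85.0, margin_db=20.0):
    """Return the indices of pings whose echo intensity stands well
    above the typical seabed background, which makes them candidates
    for man-made objects such as aircraft metal."""
    return [i for i, level in enumerate(intensities_db)
            if level > background_db + margin_db]

# Four pings off mud and one strong hit off something metallic.
pings = [-84.0, -86.0, -83.0, -55.0, -85.0]
print(flag_bright_returns(pings))  # [3]
```

Mud absorbs the pulse, so its echoes cluster near the background level; a metal hull or wing stands out by tens of decibels.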
Moline has a personal connection to the search: his grandfather was a Navy chaplain aboard the U.S.S. Princeton, one of the World War II vessels that attacked Palau. “My grandfather was on one of the aircraft carriers as part of the strike on Palau in 1944,” said Moline, “and I had received a scrapbook with that information about a week prior to going in 2013. In his journal he referenced an airman who didn’t return and had his name in there. His name was one of the names BentProp is searching for. It’d be a Hollywood ending if we found this individual.”
While there’s plenty to study in the ocean’s depths, it’s also important to study what’s happening at the surface. The aerial drones, the most recent addition to the laboratory, can soar above and survey the coastlines. Soon, Brown and other UD researchers hope to fly them along the Delaware shores to study the impacts of sea level rise and severe storms. “So we will use the quadcopters to take a lot of photographs,” said Brown, “and by knowing exactly where the vehicle is in the air and what direction it’s pointing when it takes the photograph, we can actually stitch those together and create a 3-D model or representation of what it sees.” “That’s really important for beach profiles,” Moline added, “if you get storms coming in, seeing what the erosion rates are, that kind of thing.”
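The stitching Brown describes rests on simple camera geometry: when the drone’s position, altitude and pointing direction are known, every pixel in a photograph can be mapped to a spot on the ground. A minimal sketch for the simplest case, a camera pointing straight down; the field of view and image size here are illustrative assumptions, not the lab’s actual equipment:

```python
import math

def pixel_to_ground(drone_x, drone_y, altitude_m, px, py,
                    image_width=4000, image_height=3000, fov_deg=84.0):
    """Map an image pixel to ground coordinates (meters) for a photo
    taken with the camera pointing straight down (nadir)."""
    # Ground width covered by the frame, from altitude and field of view.
    ground_width = 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))
    meters_per_pixel = ground_width / image_width
    # Offset of the pixel from the image center, in ground meters.
    dx = (px - image_width / 2.0) * meters_per_pixel
    dy = (py - image_height / 2.0) * meters_per_pixel
    return drone_x + dx, drone_y + dy

# The center pixel maps to the point directly below the drone.
print(pixel_to_ground(100.0, 200.0, 50.0, 2000, 1500))  # (100.0, 200.0)
```

Once every photo’s pixels are placed on a common ground grid like this, overlapping images can be merged into a single map, and comparing maps from before and after a storm reveals how much of the beach has eroded.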
The capabilities of these vehicles extend far beyond what humans are able to do, given the depths the drones can travel and their ability to distinguish aircraft from cephalopods. There’s no doubt that robots are key to advancing the field of oceanography into the future.
The next step for this laboratory, however, isn’t to add more robots. It’s to get more people involved. The Remuses, gliders and Gavias are bringing in so much data that the lab needs more researchers focused on analyzing all of it. “If I had all the money in the world right now, I’d invest in a core group of ten people that are working on not only the deployment side and the engineering side of the vehicles, but also the data side of how to visualize that information and dive deep down into what the information is telling us about the environment. We’d probably buy a few more bells and whistles for the drones, but in terms of major costs, what I’d do is I’d invest in some sharp people.”
The drones act as the scientists’ interpreters in the oceans. But now the challenge lies in making sense of the swaths of information they’re sending back, and how it all might change our relationship with the oceans.