The OOI, a project funded by the National Science Foundation, is planned as a networked infrastructure of science-driven sensor systems to measure physical, chemical, geological and biological variables in the ocean and on the seafloor. The OOI will be one fully integrated system collecting data on coastal, regional and global scales. The OOI Program is managed and coordinated by the OOI Project Office at the Consortium for Ocean Leadership in Washington, D.C., which is responsible for construction and initial operations of the OOI network. Three major Implementing Organizations are responsible for construction and development of the overall program. Woods Hole Oceanographic Institution and its partners, Oregon State University and Scripps Institution of Oceanography, are responsible for the coastal and global moorings and their autonomous vehicles. The University of Washington is responsible for cabled seafloor systems and moorings. The University of California, San Diego, is implementing the cyberinfrastructure component. Rutgers, The State University of New Jersey, with its partners the University of Maine and Raytheon Mission Operations and Services, is responsible for the education and public engagement software infrastructure.
National Glider Asset Map
The “Gliders for Research, Ocean Observation and Management” (GROOM) project, co-funded by the EU with €3.5 million, started in 2011 and will evaluate the requirements for setting up a sustainable European glider infrastructure to safely operate individual gliders as well as fleets of gliders in order to create a continuum of observations. In the future, operations shall be coordinated to fill the gaps left by present marine observation systems on global, regional and coastal scales, with benefits for both fundamental marine research and operational oceanography. The GROOM consortium is composed of 19 partners representing 9 European countries. The partners have a wealth of experience and expertise in all aspects of ocean observations, from the technology used to collect the data through to the output and dissemination of data products. GROOM will define, at the scientific, technological and organizational/legal levels, a European glider capacity for research and sustained observations of the oceans, in line with other European and international initiatives for marine in-situ observations.
Argo is a global array of 3,000 free-drifting profiling floats that measures the temperature and salinity of the upper 2000 m of the ocean. This allows, for the first time, continuous monitoring of the temperature, salinity, and velocity of the upper ocean, with all data being relayed and made publicly available within hours after collection.
Argo deployments began in 2000, and by November 2007 the array was 100% complete. Today’s tally of floats is shown in the figure above. While the Argo array is currently complete at 3,000 floats, maintaining it at that level requires national commitments of about 800 floats per year. Additionally, Argo continues to work toward global ocean coverage: even with the 3,000-float target achieved, more floats are frequently needed, because some areas of the ocean are overpopulated while others have gaps that must be filled with additional floats.
Besides float deployment, Argo has worked hard to develop two separate data streams: real time and delayed mode. A real time data delivery and quality control system has been established that delivers 90% of profiles to users via two global data centers within 24 hours. A delayed mode quality control system (DMQC) has been established and 60% of all eligible profiles have had DMQC applied.
Float reliability has improved each year and the float lifetime has been extended. Argo has developed a large user community in universities, government labs and meteorological/climate analysis/forecasting centers. The need for global Argo observations will continue indefinitely into the future, though the technologies and design of the array will evolve as better instruments are built, models are improved, and more is learned about ocean variability.
Brief History of Argo
The name Argo was chosen to emphasize the strong complementary relationship of the global float array with the Jason satellite altimeter mission. In Greek mythology Jason sailed in a ship called “Argo” to capture the golden fleece.
Together, the Argo and Jason data sets will be assimilated into computer models developed by project GODAE (Global Ocean Data Assimilation Experiment) that will allow a test of our ability to forecast ocean climate. For the first time, the physical state of the upper ocean is being systematically measured and the data assimilated in near real-time into computer models. Argo builds on other upper-ocean observing networks, extending their coverage in space and time, their depth range and accuracy, and enhancing them through the addition of salinity and velocity measurements.
An Argo float being deployed from a research ship.
Unlike the other upper-ocean observing networks, Argo is not confined to major shipping routes, which can vary with season. Instead, the global array of 3,000 floats will be distributed roughly every 3 degrees (about 300 km).
It will provide a quantitative description of the changing state of the upper ocean and the patterns of ocean climate variability from months to decades, including heat and freshwater storage and transport.
The data will enhance the value of the Jason altimeter through measurement of subsurface temperature, salinity, and velocity, with sufficient coverage and resolution to permit interpretation of altimetric sea surface height variability.
Argo data will be used for initializing ocean and coupled ocean-atmosphere forecast models, for data assimilation and for model testing.
A primary focus of Argo is to document seasonal to decadal climate variability and to aid our understanding of its predictability. A wide range of applications for high-quality global ocean analyses is anticipated.
Started in January 2008 as a project, Euro-Argo aims at developing a European “infrastructure” for Argo to the level where the European partners have the capacity to procure and deploy about 250 floats per year, to monitor these floats and to ensure that all the data can be processed and delivered to users (both in real-time and delayed-mode). With a mean float lifetime of 3¾ years, such a European contribution would support approximately 1/4 of the global array and provide an additional 50 floats per year for enhanced coverage in the European and marginal seas.
Listen to the Deep Ocean Environment (LIDO)
The next decades will see increasing levels of offshore industrial development that will lead to increased levels of noise pollution in the oceans. These sounds can have physical, physiological and behavioural effects on marine fauna in the area of activity: mammals, reptiles, fish and invertebrates can be affected at various levels depending on the distance to the sound source. The problem faced by the industry, and more generally by society, is that many economically important activities at sea are at risk because of a lack of information about the effects of anthropogenic sound on marine mammals, and especially a lack of available tools to mitigate these effects. Technological developments were needed to reconcile the interests of industry with the good environmental status of the oceans.
The Laboratory of Applied Bioacoustics (LAB) of the Technical University of Catalonia (BarcelonaTech, UPC) is leading an international programme entitled “Listen to the Deep Ocean Environment (LIDO)” to apply and extend developed techniques for passive acoustic monitoring of natural (e.g. rain, waves, earthquakes), biological (e.g. cetaceans, fishes, crustaceans) and artificial (man-made noises) sounds to cabled deep sea platforms and moored stations. LIDO constitutes the technical development support of the applied solutions that are now integrated and available for the administrations and the offshore industry (http://sonsetc.com).
The software framework developed under this programme is currently active at several sea observatories around the world: these include the ANTARES (ANTARES Collaboration, France) neutrino observatory, the OBSEA (UPC, Technical University of Catalonia, Spain) shallow water test site, the NEPTUNE (University of Victoria, Canada) network, the Kushiro and Hatsushima (JAMSTEC, Japan) observatories and the NEMO (INFN, Italy) sites. The system has also been tested and deployed on autonomous gliders and towed arrays in collaboration with the NURC (NATO Undersea Research Centre, La Spezia, Italy), on autonomous radio-linked buoys, on trawler-safe bottom-mounted structures and on offline recordings.
The LIDO software contains several independent algorithms that process real-time data streams. Among these, dedicated modules conduct noise assessment (including the European Marine Strategy Framework Directive noise descriptors) and detection, classification and localization of acoustic sources. From the acoustic data stream, LIDO characterizes and localizes detected events, produces spectrograms for live visualization and compresses audio for online access. The compressed audio is provided only so that users can listen to a sound stream with minimal bandwidth usage; it is not used for any scientific analysis, which is carried out on the raw data before the results are displayed. The raw data can optionally be stored locally for subsequent research.
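To illustrate the spectrogram stage of such a real-time pipeline, here is a minimal short-time-FFT sketch in Python. It is not taken from the LIDO codebase; the function name, window length and hop size are illustrative assumptions.

```python
import numpy as np

def spectrogram(signal, fs, win=1024, hop=512):
    """Magnitude spectrogram via a short-time FFT with a Hann window.

    Returns the frequency axis (Hz) and an array of shape
    (n_frames, win // 2 + 1) holding one spectrum per frame.
    """
    window = np.hanning(win)
    n_frames = 1 + (len(signal) - win) // hop
    frames = np.stack(
        [signal[i * hop : i * hop + win] * window for i in range(n_frames)]
    )
    spec = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    return freqs, spec

# Example: a pure 2 kHz tone sampled at 16 kHz should peak near 2 kHz.
fs = 16000
t = np.arange(fs) / fs                      # one second of samples
sig = np.sin(2 * np.pi * 2000 * t)
freqs, spec = spectrogram(sig, fs)
peak = freqs[np.argmax(spec.mean(axis=0))]  # dominant frequency, Hz
```

In a deployed system, this kind of windowed analysis would run continuously on the incoming hydrophone stream, with detection and classification modules consuming the same frames.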
A NASA-sponsored expedition is set to sail to the North Atlantic’s saltiest spot to get a detailed, 3-D picture of how salt content fluctuates in the ocean’s upper layers and how these variations are related to shifts in rainfall patterns around the planet.
The research voyage is part of a multi-year mission, dubbed the Salinity Processes in the Upper Ocean Regional Study (SPURS), which will deploy multiple instruments in different regions of the ocean. The new data also will help calibrate the salinity measurements NASA’s Aquarius instrument has been collecting from space since August 2011.
SPURS scientists aboard the research vessel Knorr leave Sept. 6 from the Woods Hole Oceanographic Institution in Woods Hole, Mass., and head toward a spot known as the Atlantic surface salinity maximum, located halfway between the Bahamas and the western coast of North Africa. The expedition also is supported by the National Oceanic and Atmospheric Administration and the National Science Foundation.
The researchers will spend about three weeks on site deploying instruments and taking salinity, temperature and other measurements, before sailing to the Azores to complete the voyage on Oct. 9.
They will return with new data to aid in understanding one of the most worrisome effects of climate change — the acceleration of Earth’s water cycle. As global temperatures go up, evaporation increases, altering the frequency, strength, and distribution of rainfall around the planet, with far-reaching implications for life on Earth.
TRIDENT proposes a new methodology for multipurpose underwater intervention tasks with diverse potential applications such as underwater archaeology, oceanography and the offshore industries, and goes beyond present-day methods, which are typically based on manned and/or purpose-built systems. TRIDENT is based on new forms of cooperation between an Autonomous Surface Craft (ASC) and an Intervention Autonomous Underwater Vehicle (I-AUV).
Firstly, the I-AUV performs a path-following survey, gathering optical and/or acoustic data from the seafloor, while the ASC provides geo-referenced navigation data and communications with the end user. The motion of the ASC will be coordinated with that of the I-AUV for precise Ultra Short Base Line positioning and reliable acoustic communications. After the survey, the I-AUV docks with the ASC and sends the data back to a ground station, where a map is built and a target object is identified by the end user. Secondly, the ASC navigates towards a waypoint near the intervention area to search for the object. When the target object has been found, the I-AUV switches to free-floating navigation mode. The manipulation of the object takes place through a dextrous hand attached to a redundant robot arm, assisted by appropriate perception. Particular emphasis will be put on research into the vehicle’s intelligent control architecture, to provide the embedded knowledge-representation framework and the high-level reasoning agents required to enable a high degree of autonomy and on-board decision making.
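The two-stage mission described above can be summarized as a sequence of phases. The sketch below is a hypothetical Python state machine mirroring that sequence; the phase names and transition table are illustrative assumptions, not part of the TRIDENT software.

```python
from enum import Enum, auto

class Phase(Enum):
    SURVEY = auto()      # I-AUV path-following survey of the seafloor
    DOCK = auto()        # dock with the ASC, upload survey data
    TRANSIT = auto()     # ASC navigates to a waypoint near the target
    SEARCH = auto()      # I-AUV searches for the target object
    INTERVENE = auto()   # free-floating manipulation with the dextrous hand

# Linear mission plan: each phase hands over to the next once its
# completion condition (survey done, target selected, waypoint reached,
# object found) is met. The final phase is terminal.
TRANSITIONS = {
    Phase.SURVEY: Phase.DOCK,
    Phase.DOCK: Phase.TRANSIT,
    Phase.TRANSIT: Phase.SEARCH,
    Phase.SEARCH: Phase.INTERVENE,
}

def next_phase(phase):
    """Advance to the next mission phase (terminal phases stay put)."""
    return TRANSITIONS.get(phase, phase)
```

A real control architecture would of course attach guard conditions and fault-recovery branches to each transition; the table above only captures the nominal ordering of the mission.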