Nineteen thousand sensors collect data in the Gulf of Mexico every day, feeding it back to researchers around the world. The information acquired is used to track long-term trends such as sea level rise and climate change, to enhance weather and boating forecasts, to aid shipping and navigation, and to track harmful algal blooms that could affect the health of coastal residents.
The Gulf of Mexico Coastal Ocean Observing System (GCOOS) is one of many data centres that gather this information and stream it out to industry, researchers, resource managers and the public, with the goal of providing timely, reliable and accurate information about coastal and open ocean waters.
But how do the people putting the data to work judge the accuracy and reliability of the information they’re using? A new project will develop the tools and the social and technical infrastructure to gather this “metadata” – the data about the sensors themselves – so end users know where the information came from and how it was collected. The project will make this metadata easily discoverable, searchable and available for incorporation into automated archival systems, so users have a better understanding of the data’s quality and can use it appropriately.
The two-year pilot project is being led by scientists at Woods Hole Oceanographic Institution and research partners from GCOOS/Texas A&M University-Corpus Christi, the University of California, Santa Barbara, the Monterey Bay Aquarium Research Institute and Botts Innovative Research, Inc.