Underwater gliders have become widely used in the last decade. This has led to a proliferation of data and the concomitant development of tools to process the data. These tools focus primarily on converting the data from its raw form to more accessible formats and often rely on proprietary programming languages. This has left a gap in the processing of glider data for academics, who often need to perform secondary quality control (QC), calibrate, correct, interpolate and visualize data. Here, we present GliderTools, an open-source Python package that addresses these needs of the glider user community. The tool is designed to shift the user's focus from the processing to the data itself. GliderTools does not aim to replace existing software that converts raw data and performs automatic first-order QC. In this paper, we present a set of tools that includes secondary cleaning and calibration, calibration procedures for bottle samples, fluorescence quenching correction, photosynthetically available radiation (PAR) corrections and data interpolation in the vertical and horizontal dimensions. Many of these processes have been described in other studies but have not previously been collated into a single package designed for underwater glider data. Importantly, we provide potential users with guidelines on how these tools are used, so that they are easily and rapidly accessible to a wide range of users, from students to experienced researchers. We recognize that this package may not be all-encompassing for every user, and we thus welcome community contributions and promote GliderTools as a community-driven project for scientists.
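The secondary cleaning step mentioned above can be illustrated with a minimal rolling-median despiking sketch. This is a generic NumPy illustration with hypothetical variable names and thresholds, not the GliderTools API or its actual algorithm:

```python
import numpy as np

def despike(profile, window=5, n_std=2.0):
    """Flag and replace spikes in a 1-D glider profile.

    A point is treated as a spike if it deviates from the rolling
    median by more than n_std rolling standard deviations.
    Hypothetical illustration; not the GliderTools implementation.
    """
    profile = np.asarray(profile, dtype=float)
    half = window // 2
    padded = np.pad(profile, half, mode="edge")
    # One window per original sample, centred on that sample
    win = np.lib.stride_tricks.sliding_window_view(padded, window)
    med = np.median(win, axis=1)
    std = win.std(axis=1)
    spikes = np.abs(profile - med) > n_std * std
    cleaned = profile.copy()
    cleaned[spikes] = med[spikes]   # replace spikes with local median
    return cleaned, spikes

# A smooth temperature profile with one obvious spike at index 3
temps = np.array([10.0, 10.1, 10.2, 15.0, 10.3, 10.4, 10.5])
cleaned, flags = despike(temps)    # only index 3 is flagged
```

In practice the window length and threshold would be tuned per sensor and per deployment; the point here is only the shape of the operation.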
Tools and Data
Ocean ecosystems are in decline, yet we also have more ocean data, and more data portals, than ever before. To make effective decisions regarding ocean management, especially in the face of global environmental change, we need to make the best possible use of these data. Yet many data are not shared, are hard to find, and cannot be effectively accessed. We identify three classes of challenges to data sharing and use: uploading, aggregating, and navigating. While tremendous advances have occurred to improve ocean data operability and transparency, the effect has been largely incremental. We propose a suite of both technical and cultural solutions to overcome these challenges, including the use of natural language processing, automatic data translation, ledger-based data identifiers, digital community currencies, data impact factors, and social networks as ways of breaking through these barriers. One way to harness these solutions could be a combinatorial machine that embodies both technological and social-networking solutions to aggregate ocean data and to allow researchers to discover, navigate, and download data, as well as to connect researchers and data users while providing an open-source backend for new data tools.
Marine heatwaves (MHWs), or prolonged periods of anomalously warm sea water temperature, have been increasing in duration and intensity globally for decades. However, there are many coastal, oceanic, polar, and sub-surface regions where our ability to detect MHWs is uncertain due to limited high-quality data. Here, we investigate the effect that short time series length, missing data, or linear long-term temperature trends may have on the detection of MHWs. We show that MHWs detected in time series as short as 10 years did not have durations or intensities appreciably different from events detected in a standard 30-year time series. We also show that the output of our MHW algorithm for time series missing less than 25% of their data did not differ appreciably from that for a complete time series, and that the allowable proportion of missing data could cautiously be increased to 50% when gaps were filled by linear interpolation. Finally, linear long-term trends of 0.10°C/decade or greater added to a time series caused larger changes (increases) in the count and duration of detected MHWs than shortening a time series to 10 years or removing more than 25% of the data. The long-term trend in a time series thus has the largest effect on the detection of MHWs and introduces the largest range of added uncertainty in the results. Time series length has less of an effect on MHW detection than missing data, but adds a larger range of uncertainty to the results. We provide suggestions for best practices to improve the accuracy of MHW detection with sub-optimal time series and show how the accuracy of these corrections may change regionally.
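The detection step underlying this kind of analysis can be sketched as a simple threshold-exceedance run search, following the widely used definition of an MHW as at least five consecutive days above a climatological threshold (typically the seasonally varying 90th percentile). This is a simplification: the full algorithm's climatology construction and gap-joining rules are omitted.

```python
import numpy as np

def detect_mhw(temp, threshold, min_duration=5):
    """Return (start, end) index pairs of detected marine heatwaves.

    An event is a run of at least `min_duration` consecutive days on
    which `temp` exceeds the climatological `threshold`.  Simplified
    sketch: joining of near-contiguous events is not implemented.
    """
    above = np.asarray(temp) > np.asarray(threshold)
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                       # run begins
        elif not flag and start is not None:
            if i - start >= min_duration:   # run long enough?
                events.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_duration:
        events.append((start, len(above) - 1))
    return events

# 12 days: a 6-day exceedance (days 3-8) and a 2-day blip (too short)
temp = np.array([14, 14, 14, 18, 18, 18, 18, 18, 18, 14, 17, 17.0])
thresh = np.full(12, 16.0)
detect_mhw(temp, thresh)  # → [(3, 8)]
```

Shortened records, data gaps, and added trends all act on this detection through the `above` mask, which is why the abstract's sensitivity tests translate directly into changed event counts and durations.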
With globally accelerating rates of environmental disturbance, coastal marine ecosystems are increasingly prone to non-linear regime shifts that result in a loss of ecosystem function and services. A lack of early-detection methods and an over-reliance on limits-based approaches mean that these tipping points manifest as surprises. Consequently, marine ecosystems are notoriously difficult to manage, and scientists, managers, and policy makers are paralyzed in a spiral of ecosystem degradation. This paralysis is caused by the inherent need to quantify the risk and uncertainty surrounding every decision. While progress toward forecasting tipping points is ongoing and important, an interim approach is desperately needed to enable scientists to make recommendations that are credible and defensible in the face of deep uncertainty. We discuss how current tools for risk assessment and scenario planning, coupled with expert opinion, can be adapted to bridge gaps in quantitative data, enabling scientists and managers to prepare for many plausible futures. We argue that these tools are currently underutilized in a marine cumulative-effects context but offer a way to inform decisions in the interim while predictive models and early-warning signals remain imperfect. This approach will require redefining the way we think about managing for ecological surprise to include actions that not only limit the drivers of tipping points but also increase socio-ecological resilience, yielding satisfactory outcomes under multiple possible futures that are inherently uncertain.
Pollution by marine litter is raising major concerns due to its potential impact on marine biodiversity and, above all, on endangered mega-fauna species such as cetaceans and sea turtles. The density and distribution of marine litter and mega-fauna have traditionally been monitored through observer-based methods, yet the advent of new technologies has introduced aerial photography as an alternative monitoring method. However, to integrate results produced by different monitoring techniques and to consider the photographic method a viable alternative, this ‘new’ methodology must be validated. This study aims to compare observations obtained from the concurrent application of observer-based and photographic methods during aerial surveys. To do so, a Partenavia P-68 aircraft equipped with an RGB sensor was used to monitor the waters off the Spanish Mediterranean coast along 12 transects (941 km). Over 10,000 images were collected and checked manually by a photo-interpreter to detect potential targets, which were classified as floating marine macro-litter, mega-fauna or seabirds. The two methods allowed the detection of items from all three categories and proved equally effective for detecting cetaceans, sea turtles and large fish at the sea surface. However, the photographic method was more effective for floating-litter detection, and the observer-based method was more effective for seabird detection. These results provide the first validation of the use of aerial photography to monitor floating litter and mega-fauna at the sea surface.
The dynamics of fish length distributions is a key input for understanding fish population dynamics and making informed management decisions on exploited stocks. Nevertheless, in most fisheries, the length of landed fish is still measured by hand. As a result, length estimation is precise at the fish level, but owing to the inherent high costs of manual sampling, sample sizes tend to be small. Accordingly, the precision of population-level estimates is often suboptimal and prone to bias when properly stratified sampling programmes are not affordable. Recent applications of artificial intelligence to fisheries science are opening a promising opportunity for the massive sampling of fish catches. Here, we present the results obtained using a deep convolutional network (Mask R-CNN) for unsupervised (i.e. fully automatic) European hake length estimation from images of fish boxes collected automatically at the auction centre. The estimated mean fish length at the box level is accurate: for average lengths ranging from 20 to 40 cm, the root-mean-square deviation was 1.9 cm, and the maximum deviation between the estimated and measured mean body lengths was 4.0 cm. We discuss the challenges and opportunities that arise with the use of this technology to improve data acquisition in fisheries.
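The final step in such a pipeline, converting a segmentation mask into a length in centimetres, can be sketched as follows, assuming the known width of the fish box provides the pixel-to-centimetre scale. All dimensions, the 60 cm crate width, and the bounding-box simplification are hypothetical illustrations, not the Mask R-CNN pipeline used in the study:

```python
import numpy as np

def box_scale(box_width_px, box_width_cm=60.0):
    """cm per pixel, calibrated from the known width of the fish box
    (60 cm is a hypothetical standard-crate dimension)."""
    return box_width_cm / box_width_px

def fish_length_cm(mask, cm_per_px):
    """Estimate body length from a boolean segmentation mask as the
    longer side of its axis-aligned bounding box (a simplification;
    a curved fish would need a medial-axis or keypoint approach)."""
    rows = np.any(mask, axis=1).nonzero()[0]
    cols = np.any(mask, axis=0).nonzero()[0]
    height = np.ptp(rows) + 1   # extent in pixels, top to bottom
    width = np.ptp(cols) + 1    # extent in pixels, left to right
    return max(height, width) * cm_per_px

# A synthetic horizontal "fish": 300 px long, 60 px tall, in an image
# where the 60 cm box edge spans 1200 px -> 0.05 cm per pixel
mask = np.zeros((200, 400), dtype=bool)
mask[70:130, 50:350] = True
scale = box_scale(1200)
fish_length_cm(mask, scale)  # → 15.0
```

Box-level mean lengths would then be averages of such per-fish estimates, which is where the reported 1.9 cm root-mean-square deviation applies.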
Elasmobranchs, extremely charismatic yet threatened animals, are still an important economic resource for fishers in many parts of the world, providing significant income through trade. Even though Greek seas host at least 67 elasmobranch species, their biology and ecology remain to a large extent unknown. In the present study, the integration of conventional sources of data (legislation, official fisheries landings, fish market values and import/export data) and unconventional ones (social media), together with the use of genetics, aims to outline elasmobranch fisheries and trade in Greece and to identify “weak spots” that sabotage their conservation. Results revealed that: (a) about 60% of the 68 specimens collected in fish markets were mislabelled, most commonly Prionace glauca and Mustelus spp.; (b) illegal fishing is a reality; (c) Greece is one of the top three southern European Union countries in terms of elasmobranch market size; (d) the Aegean Sea, and especially its northern part (Thermaikos Gulf and Thracian Sea), contributed more than half of the Greek M. mustelus fisheries landings; and (e) wholesale prices of elasmobranchs have remained stable over the last decade. Mislabelling and illegal trade of elasmobranchs are thus commonplace in Greece. This situation stems from an incoherent and complex fisheries legislative framework, a product of institutional decoupling, and from discrepancies in the collection and analysis of fisheries-related data, which substantially reduce the efficiency of fisheries management in Greek seas.
As ocean acidification (OA) sensor technology develops and improves, in situ deployment of such sensors is becoming more widespread. However, the scientific value of these data depends on the development and application of best practices for calibration, validation, and quality assurance, as well as on further development and optimization of the measurement technologies themselves. Here, we summarize the results of a 2-day workshop on OA sensor best practices held in February 2018 in Victoria, British Columbia, Canada, drawing on the collective experience and perspectives of the participants. The workshop on in situ Sensors for OA Research was organized around three basic questions: 1) What are the factors limiting the precision, accuracy and reliability of sensor data? 2) What can we do to facilitate the quality assurance/quality control (QA/QC) process and optimize the utility of these data? and 3) What sort of data or metadata are needed for these data to be most useful to future users? A synthesis of the workshop participants' discussion of these questions, and the conclusions drawn from it, is presented in this paper.
A new method combining enrichment factor (EF) determination, nonmetric multidimensional scaling (NMS), and geographic information system (GIS) techniques was developed to identify anthropogenic heavy metal sources in the marine sediments of Hong Kong. First, the EF was determined to differentiate between heavy metals originating from human and natural sources. Subsequently, NMS was applied to identify distinct source patterns of heavy metals, and the NMS score was calculated and spatially interpolated using GIS technology to evaluate the spatial extent of anthropogenic impacts in different areas. The concentrations of heavy metals in the sediments of Hong Kong substantially exceeded their background values, demonstrating anthropogenic pollution. Two different types of human sources could be identified via NMS: one representing industrial pollution discharges in the period from the 1960s to the 1980s, before pollution control was introduced, and one representing sewage discharge before the Tolo Harbour Action Plan in the mid-1980s.
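The enrichment factor itself follows a standard formula: the metal-to-reference-element concentration ratio in the sample divided by the same ratio in the background, where the reference is a conservative element such as Al or Fe. A minimal sketch, with purely hypothetical concentrations:

```python
def enrichment_factor(metal_sample, ref_sample,
                      metal_background, ref_background):
    """EF = (metal/ref)_sample / (metal/ref)_background.

    `ref` is a conservative reference element (commonly Al or Fe),
    used to normalize away grain-size and mineralogical effects.
    EF near 1 suggests a natural origin; values well above 1
    (exact thresholds vary by study) indicate anthropogenic
    enrichment.
    """
    return (metal_sample / ref_sample) / (metal_background / ref_background)

# Hypothetical numbers: Cu at 80 mg/kg against a 20 mg/kg background,
# normalized to Al concentrations of 8% (sample) and 4% (background)
ef_cu = enrichment_factor(80.0, 8.0, 20.0, 4.0)  # → 2.0
```

The Al normalization is what lets the EF separate a genuinely enriched sediment from one that is simply finer-grained and therefore metal-richer for natural reasons.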
In marine resource development and marine environmental protection, the government supervises the production behavior of enterprises, enterprises accept government supervision, and non-profit organizations supervise the whole process. On this basis, a conceptual model of the relationship between government, enterprises, and non-profit organizations is established, and the internal mechanism governing their interactions is analyzed. Using game theory, a simulation model of government, enterprises, and non-profit organizations is constructed, and a Nash equilibrium solution and strategy selection analysis are carried out. The correlation between the game participants and their strategy selection is simulated and analyzed with MATLAB 7 software. Lastly, relevant countermeasures and suggestions are put forward to engender effective supervision by government departments, continuous environmental development and effective environmental protection by enterprises, and effective supervision by non-profit organizations. Studying the regulatory strategies of the government, enterprises, and non-profit organizations can provide a foundation for marine resource development and marine environmental protection policy in accordance with the current situation.
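The kind of game-theoretic simulation described above can be sketched (in Python rather than MATLAB) as two-population replicator dynamics for a simplified government-enterprise inspection game. The third player (non-profit organizations) and the study's actual payoffs are not reproduced here; every parameter value below is illustrative:

```python
def replicator_step(x, y, dt=0.01, c=1.0, f=3.0, d=4.0, k=2.0):
    """One Euler step of two-population replicator dynamics for a
    hypothetical government-enterprise inspection game.

    x: share of governments that supervise; y: share of enterprises
    that comply.  Payoff parameters (c: supervision cost, f: fine,
    d: pollution damage borne by government, k: compliance cost)
    are illustrative numbers, not values from the study.
    """
    u_supervise = -c + (1 - y) * f   # fines collected from polluters
    u_ignore = -(1 - y) * d          # damage from unchecked pollution
    u_comply = -k
    u_pollute = -x * f               # expected fine when polluting
    # Each share grows in proportion to its payoff advantage
    x_new = x + dt * x * (1 - x) * (u_supervise - u_ignore)
    y_new = y + dt * y * (1 - y) * (u_comply - u_pollute)
    return x_new, y_new

x, y = 0.5, 0.5
for _ in range(2000):
    x, y = replicator_step(x, y)
```

With this payoff structure there is an interior mixed equilibrium (supervision share k/f, compliance share 1 - c/(f + d)), and trajectories typically cycle around it rather than converge, which is why numerical simulation of strategy selection is informative here.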