How to balance business and research interests in marine environmental monitoring data was a subject of debate at the 5th International Conference on Ocean Energy, held at the World Trade and Convention Centre in Halifax, Nova Scotia, on 4-6 November 2014. Held for the first time in North America, the conference brought together industry, government, and scientific stakeholders in ocean energy to discuss the future of marine renewable energy sources, including offshore wind, wave, and tidal energy.
The need for standardization and collaboration echoed across a number of sessions and plenaries. Mutual trust and pooling of resources, data, and information are integral aspects of collaboration. However, in the competitive, emergent field of ocean energy, information sharing is not always advantageous.
In addition, the commercialization of raw data offers a substantial source of revenue. The resulting tension between data as a commodity and open data initiatives was explored in the panel session entitled, “Measurement, Communication and Monitoring.”
Monitoring devices and marine renewable energy
Environmental monitoring plays a vital role in marine renewable energy by revealing the conditions in which machinery operates and by measuring potential impacts on marine life. Subsea monitoring devices often run twenty-four hours a day, capturing multiple types of data (e.g., video, sound, and sonar) in real time. This data is then fed into research facilities operated by organizations such as the European Marine Energy Centre (EMEC) and the Fundy Ocean Research Center for Energy (FORCE).
Given the sheer volume of raw data collected, these organizations are faced with the increasingly common challenge of how to manage Big Data. While previous research was often hampered by data scarcity, today’s researchers must deal with the problems associated with too much data (e.g., the strain on human resources and infrastructure requirements capable of both storing and parsing massive datasets).
Monitoring organizations have developed different strategies to manage data overload. Some devices are equipped with sensors set to detect “events” (i.e., an aberration from the normal ocean state) which then trigger their recording devices. Others feed a stream of live data into software capable of filtering out long stretches of inactivity, or even automatically indexing events so that they become searchable. While such methods help to mitigate the problem of data deluge, they alone cannot make raw data meaningful.
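The event-triggered strategy described above can be illustrated with a minimal sketch. The detection criterion here, a sample deviating from a rolling baseline by several standard deviations, is an illustrative assumption, not the detector any particular facility uses; real subsea monitors apply far richer signal processing.

```python
from statistics import mean, stdev

def detect_events(samples, window=5, threshold=3.0):
    """Flag indices where a sample deviates from the rolling
    baseline by more than `threshold` standard deviations.

    A toy stand-in for event-triggered recording: only flagged
    indices would need to be stored or indexed, rather than the
    full stream.
    """
    events = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
            events.append(i)
    return events

# A mostly quiet signal with one loud spike at index 8
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 9.0, 1.0]
print(detect_events(signal))  # → [8]
```

Indexing only the flagged positions is what makes such events searchable after the fact, while the long stretches of inactivity between them can be discarded or archived cheaply.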
What to do with data?
For data to become useful, it must be interpreted, and interpretation requires mechanisms for the data to reach the hands of researchers. It is at this dissemination stage that problems arise. The Big Data deluge demonstrates that the issue lies not in our ability to generate data, but in our willingness and capacity to share and manage it.
University of Washington researcher James Joslin suggested that one of the significant obstacles his team encounters is the inability to obtain data from developers. He identified a need for increased collaboration and open access to raw data. This point was reiterated by others including Dr. Mairi Best, a scientific consultant for the European Multidisciplinary Seafloor & Water Column Observatory, and Dr. Anna Redden, Director of the Acadia Tidal Energy Institute.
Open access data policies could lead to greater transparency and prove beneficial to industry and scientists alike. Currently, data is being drawn from marine energy test sites faster than it can be analysed, and data that is metaphorically collecting dust in a storage facility has no utility. If this raw data were made widely available to researchers, more value could be drawn from it: more eyes on the data usually means more insights, essentially providing businesses with “free” expertise.
Of course, before such expertise can be provided, the cost of data collection must be borne by someone. Companies often invest millions of dollars in building infrastructure to collect and store data. For the industry to remain financially viable, these companies must see a return on their investments, and leasing their facilities (and databases) to researchers is a core component of their business model. If the data cannot be collected cost-free, why should it simply be given away? In response to this line of reasoning, Dr. Best, a proponent of open data initiatives, argued that “you may have spent a lot of money to get the data, but it’s not information until somebody makes sense of it” (Personal communication, November 5, 2014).
Another aspect of this debate centered on community engagement, highlighted prominently in the opening keynote address “Tidal Opportunities in the Maritimes and the Future Impact of a Local Industry on a Global Scale” delivered by Chris Huskilson of Emera, Inc. Since citizens have a stake in coastal and environmental issues, some suggest companies have a societal obligation to share environmental monitoring data. For example, the potentially broad utility of environmental data might be overlooked if the data were examined only with regard to a particular need. Scott McLean of Ocean Networks Canada discussed how his organization worked with schools to help educate children about ocean issues. Ocean Networks Canada has also successfully engaged citizen scientists to assist in the processing of video data through gamification techniques, with one player contributing over 40,000 annotations. Such initiatives help to both inform the public and give citizens a stake in oceans management.
It is the nature of wicked problems to resist simple solutions. Accurate and current environmental data are invaluable to researchers in many disciplines. At the same time, environmentally beneficial technologies like marine renewables will not gain traction without economically viable industries supporting them. Oliver Wragg of EMEC suggested that as the industry moves toward commercial maturity, more open data initiatives will emerge. Although data mined from marine energy sites is plentiful, facilities capable of generating, storing, and parsing this data are scarce. For the present, that scarcity means the data will remain a valuable commodity.
Ultimately, any change in how monitoring data is shared will have to be systemic. If such data is valuable, then it could be worth subsidizing the cost of research data collection facilities in exchange for open data policies. The challenge will lie in balancing a need for open data with the high costs required to collect it. Costs are always present, and the question is: how much is this data worth? In the end, can a balance be achieved?
Author: Lee Wilson