Opinion: The Climate Change Fight Will Need A Lot Of Good Data To Work

Mar 08, 2016

At the end of 2015, over 150 world leaders descended on Paris to discuss what is arguably one of the most important issues currently facing the human race: climate change.

For two weeks, Presidents Obama, Putin and Xi, Prime Minister David Cameron and German Chancellor Angela Merkel, as well as numerous other world leaders, discussed and debated the severity of climate change and how we as a species would come together to slow down, or even reverse, the damage we have inflicted on the planet.

It was in this environment that a new landmark agreement was made: 195 nations from across the world will attempt to cut their greenhouse gas emissions and limit the rise in global average temperature to “well below 2C” above pre-industrial levels. The agreement represents a commitment from every participating nation; because it was adopted by consensus, a single objection would have sunk the deal. Part of what created this unanimity of thought is the science behind climate change – science underpinned by data.

It should come as little surprise that there are naysayers with regard to climate change, who claim it is a hoax, a deception produced by ‘environmental extremists’ or by political thinkers who favour greater government involvement in society. However, the data on climate change are far harder to deny, and it is in the data that the proof becomes evident.

All over the globe, there are scientists dedicated to the collection and analysis of data for studying the climate and the weather. It is through their work that the scientific community is able to demonstrate the effects of CO2 emissions and fossil fuel consumption on global temperature and the planet more broadly.

There are a myriad of data sources, for example:

  • Marine observation stations, such as moored buoys in the North East Atlantic, Marine Automatic Weather Stations aboard lightships, and island systems, which gather data on air pressure, air and sea temperature, humidity, wind speed and direction, and wave height and period.
  • Ice cores gathered from the Antarctic, the Arctic and glaciers around the world, which preserve evidence of the temperature and chemical changes that have occurred over the past hundreds of thousands of years.
  • Satellite observations, which take measurements for a whole host of variables, such as sea-level rise, sea surface temperature, wind patterns, precipitation patterns, vegetation change over time and much more.

Gathering all of this data together for analysis can be a challenge for climate organisations, particularly because a huge amount of weather data is gathered by volunteer weather watchers. According to NASA, there are over 8,700 citizen observers in the National Weather Service’s Cooperative Observer Program who log weather data daily. The contribution that volunteers make to climate science is invaluable; however, it makes the amalgamation of data and its subsequent analysis difficult.
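
To give a sense of what that amalgamation involves, below is a minimal sketch in Python, using the pandas library, of the normalisation step that turns observations logged in different formats into one common table. The file names, column names and units here are illustrative assumptions, not any programme’s real schema.

    import pandas as pd

    # Each source logs the same observations under different column names; some use
    # Fahrenheit, others Celsius. These mappings are illustrative assumptions.
    SOURCES = {
        "coop_observers.csv": {"obs_date": "date", "station": "station_id", "temp_f": "temp_f"},
        "buoy_network.csv": {"timestamp": "date", "buoy_id": "station_id", "air_temp_c": "temp_c"},
    }

    def load_and_normalise(path, column_map):
        """Read one source and rename its columns to a common schema."""
        df = pd.read_csv(path).rename(columns=column_map)
        df["date"] = pd.to_datetime(df["date"])
        if "temp_f" in df.columns:  # convert so every source shares one unit
            df["temp_c"] = (df["temp_f"] - 32.0) * 5.0 / 9.0
        return df[["date", "station_id", "temp_c"]]

    # One tidy table, ready to be analysed alongside other observation records.
    combined = pd.concat(
        [load_and_normalise(path, cols) for path, cols in SOURCES.items()],
        ignore_index=True,
    )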

Bringing together the vast quantities of climate data for analysis requires technology infrastructure that can do three things: handle the portability of huge data sets; enable easier management and control of those data sets and of incoming data sources; and provide the security needed to ensure the longevity of those data sets.

Significant Computing Resource

Within these broader technology requirements, the systems have to be able to detect and handle variations in data from the point of gathering, and to recognise and mitigate anomalies. Anomalies can arise from something as simple as human error, from new equipment used to take measurements, or from contextual changes. For example, growing urbanisation can skew temperature findings, because urban centres have higher levels of emissions and a greater density of buildings that retain and produce heat. The technology must be able to recognise this from its inputs, and that requires a hugely powerful computing system.
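
As a simple illustration of the kind of check involved, the sketch below flags readings that stray far from a rolling baseline so that they can be reviewed rather than silently accepted. The window size and threshold are arbitrary values chosen for illustration; operational quality control is considerably more sophisticated.

    import pandas as pd

    def flag_anomalies(readings, window=30, threshold=4.0):
        """Return a boolean mask marking readings far outside the rolling norm."""
        rolling_mean = readings.rolling(window, min_periods=10).mean()
        rolling_std = readings.rolling(window, min_periods=10).std()
        z_scores = (readings - rolling_mean) / rolling_std
        return z_scores.abs() > threshold

    # Flag rather than delete: a flagged value might be a sensor fault, a transcription
    # error, or a genuine extreme that deserves a closer look.
    # suspect = flag_anomalies(station_temperatures)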

High-performance computing (HPC), also known as ‘supercomputing’, is perfect for this kind of data- and compute-intensive work. Without it, the valuable insights that scientists rely on to draw accurate conclusions about humanity’s impact on the climate and the world would not be possible. Supercomputers have been around since the 1960s, but it is only in the last few decades that they have become available to more parties, thanks to advances in technology – whether through vendor-purchased hardware, build-your-own systems, or cloud-based HPC, where a “virtual” supercomputer can be created in a matter of minutes.

Whether it is an on-premises, custom-built computing system or one that is spun up in the cloud, what will determine the success of HPC climate research is the efficacy of the researchers’ data strategy. The vast data sets collected will not sit in the HPC system at all times; instead, they will be held in cost-effective storage systems that meet the researchers’ requirements. This could mean HDDs, flash storage or even tape archive. The data could be qualitative or quantitative, recent or decades old, and it could sit on-premises or in the cloud.
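
A simple, vendor-neutral sketch of one part of such a data strategy is the staging step below, in which a run pulls only the data sets it needs from cheaper long-term storage into fast scratch space attached to the compute system. The paths and the manifest format are assumptions made for the sake of illustration.

    import csv
    import shutil
    from pathlib import Path

    ARCHIVE_ROOT = Path("/archive/climate")      # cheap bulk storage (HDD or recalled tape)
    SCRATCH_ROOT = Path("/scratch/run_2016_03")  # fast storage visible to the compute nodes

    def stage_datasets(manifest_csv, start_year, end_year):
        """Copy the archived files listed in a manifest for the requested years to scratch."""
        staged = []
        with open(manifest_csv, newline="") as fh:
            for row in csv.DictReader(fh):       # expected columns: relative_path, year
                if start_year <= int(row["year"]) <= end_year:
                    src = ARCHIVE_ROOT / row["relative_path"]
                    dst = SCRATCH_ROOT / row["relative_path"]
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(src, dst)
                    staged.append(dst)
        return staged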

What researchers need is the ability to easily gather the relevant data from these disparate storage systems and then feed it into the HPC system for analysis. One way of doing this is through an advanced storage operating system, such as NetApp’s Clustered Data ONTAP, that can overlay all storage resources so that they act as one. It is as if the storage OS creates a fabric across IT environments; this makes data management and transportation easier, and ultimately means that analysis can take place, and conclusions can be drawn, more quickly.
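
The underlying idea can be pictured, in a very stripped-down way, as a catalogue that lets analysis code ask for a data set by name without caring which storage tier currently holds it. The sketch below is a generic illustration of that concept, not of how Clustered Data ONTAP is implemented; the tier names and mount points are assumptions.

    from pathlib import Path

    # Each tier maps to a mount point; tiers are searched in order, fastest first.
    STORAGE_TIERS = {
        "flash": Path("/mnt/flash"),
        "hdd": Path("/mnt/bulk"),
        "archive": Path("/mnt/tape_cache"),
    }

    def locate(dataset_name):
        """Return the path of a data set, wherever it physically lives."""
        for tier, root in STORAGE_TIERS.items():
            candidate = root / dataset_name
            if candidate.exists():
                return candidate
        raise FileNotFoundError(f"{dataset_name} is not on any configured tier")

    # Analysis code simply opens locate("sea_surface_temps_1980_2015.nc") and never
    # needs to know whether the file sits on flash, disk or a recalled archive copy.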

Nearly Midnight?

The clock is ticking for climate change. We no longer have time to wait for analysis. If we are to save our planet and limit the consequences of the damage already inflicted, scientists need to deliver findings and advice faster. It is only through the advances in technology – be that supercomputing or the storage systems that hold the data – that we can keep up with the pace of climate change.


Laurence James, Product Alliances and Solutions Marketing Manager, NetApp

Image Credit: Flickr/ United Nations Photo

Published under license from ITProPortal.com