As we write this, several million people are without power on the East Coast of the US as a result of Hurricane Sandy. In the aftermath of the storm, the press and industry experts are writing about the fragility of grid infrastructures and how a lack of smarter, newer, more secure grid technologies contributed to the extent of the power outages.
Sandy’s circumstances and commentary echo those of other large-scale blackouts in the last several years. The 2003 blackout that affected more than 50 million people in the US and Canada was particularly significant. After that event, the US Department of Energy (DoE) issued a report which found that the outage did not happen for lack of action.
Many decisions were made, but the DoE concluded that the blackout was due in large part to a “lack of awareness of deteriorating conditions”. The report recommended that a real-time measurement system and computer-based operational and management tools be developed to enable better decisions and actions.
Almost 10 years later, the world’s energy infrastructures are beginning to get the injection of better intelligence needed to create the smart grid, and enable utilities to achieve situational awareness to reduce the risk of power outages from outside forces.
The term ‘situational awareness’ (SA) originated during World War I, when it described a crucial capability for military aircrews. Today, SA concepts are used in a variety of industries, such as air traffic control, nuclear power plant operation, vehicle operation, anaesthesiology and, of course, the smart grid. For the smart grid, SA involves recognising the current state of grid elements, analysing the implications and taking effective actions. Implementing SA will involve sensors for data collection, information systems for analysis and processes for effective courses of action.
Given the scale of the grid, these implementations will involve big data operations, substantial real-time analytics and systems for complex decision-making.
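As a toy illustration of that sense, analyse and act loop, the sketch below classifies a stream of frequency readings and maps each grid state to a response. The thresholds, state names and actions are our own illustrative assumptions, not any utility’s actual logic:

```python
# Illustrative situational-awareness loop: ingest sensor readings,
# assess grid state, choose an action. All thresholds and action
# names are made up for illustration.

def assess(frequency_hz: float) -> str:
    """Classify grid state from a single frequency reading (US 60 Hz grid)."""
    if 59.95 <= frequency_hz <= 60.05:   # assumed nominal band
        return "normal"
    if 59.5 <= frequency_hz <= 60.5:     # assumed warning band
        return "deteriorating"
    return "critical"

ACTIONS = {
    "normal": "no action",
    "deteriorating": "alert operator",
    "critical": "shed load",
}

def situational_awareness(readings):
    """Yield (state, action) for each incoming reading."""
    for f in readings:
        state = assess(f)
        yield state, ACTIONS[state]

if __name__ == "__main__":
    for state, action in situational_awareness([60.01, 59.8, 58.9]):
        print(state, "->", action)
```

A production system would of course combine many sensor types and far richer analytics; the point here is only the shape of the loop: measure, classify, act.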
Utilities’ big data reality
While SA is conceptually simple, managing the volume, variety and velocity of big data makes building SA systems for the smart grid especially challenging. Today, widespread sensors create new data at a rapidly growing rate. The deployment of planned synchrophasors alone over the next several years will increase data volumes for utilities by 700 to 800 percent.
Furthermore, the use of millions of smart meters will create many new substantial data streams. Smart meters measure a variety of data types such as consumption and power quality, as well as diagnostic and event-related data (e.g. connect/disconnect information).
The number of smart meters will triple between 2010 and 2014, reaching 50 million in the US alone. Italy rolled out smart meters to an entire customer base of more than 30 million between 2000 and 2005. The Department of Energy and Climate Change in the UK will have smart meters in all homes (more than 27 million households involved) by 2020. In China, State Grid has pledged to roll out a smart grid system by 2020.
Scalable, real-time big data solutions will be the only way to deal with hundreds of thousands and eventually millions of transactions per second to create value through wide-area situational awareness and informed decision-making. Utilities face key issues about smart grid data that make situational awareness a challenge for them. Big data is traditionally defined with three parameters:
– Volume: Certainly a big data problem for utilities, as increased synchrophasor use is projected to create an eight-fold data volume increase in less than 10 years. Moreover, tens of millions of households will be added to the smart meter network. This will create hundreds of thousands of transactions per second to be handled by the utilities’ backend information system.
– Variety: The array of different sensors and systems that utilities already apply to gain intelligence about power consumption and flow create more than a dozen different data formats. This will drastically increase with the deployment of additional smart grid technologies.
– Velocity: Many critical decisions about grid operations and power trading require analysis of large and complex data sets and must occur in milliseconds.
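Rough arithmetic shows how quickly meter fleets reach those transaction rates. The meter counts and reporting intervals below are illustrative assumptions, not figures from any specific deployment:

```python
# Back-of-the-envelope estimate of meter-reading throughput.
# Meter counts and reporting intervals are illustrative assumptions.

def readings_per_second(meters: int, interval_seconds: int) -> float:
    """Average message rate for `meters` devices, each reporting
    once every `interval_seconds`."""
    return meters / interval_seconds

# 30 million meters reporting every 15 minutes:
print(readings_per_second(30_000_000, 15 * 60))   # ~33,333 messages/s

# The same fleet reporting every minute:
print(readings_per_second(30_000_000, 60))        # 500,000 messages/s
```

Shortening the reporting interval from 15 minutes to one minute pushes a 30-million-meter fleet from tens of thousands to half a million messages per second, which is exactly the ‘hundreds of thousands of transactions per second’ regime described above.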
Utilities present their own unique challenges. Two of the biggest data characteristics that must be managed are:
– Validity: Utility data must be validated as clean, correct and useful; otherwise flawed data will corrupt business processes or open security vulnerabilities. Because validation is time-dependent, information in the utility environment has a ‘shelf life’ during which it is valid and useful, and it needs to be stored only for that period. The question of when to archive or even dispose of data therefore becomes relevant, as retention drives the cost of storing the large data quantities inherent in a big data problem.
– Veracity: Establishing trust in big data presents a huge challenge as the variety and number of sources grows. The integrity of the business processes, the security of the information systems and the protection of privacy for the customers are all significant issues and become increasingly complex as data volumes, variety and velocity increase.
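A minimal sketch of validity and shelf-life handling might look like the following; the field names, plausibility ranges and 90-day retention window are illustrative assumptions, not any utility’s actual rules:

```python
# Sketch of meter-data validation with a 'shelf life': readings outside
# plausible ranges are rejected, and readings older than a retention
# window become eligible for archiving. All values are assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

SHELF_LIFE = timedelta(days=90)  # assumed retention window

@dataclass
class MeterReading:
    meter_id: str
    timestamp: datetime
    kwh: float          # consumption
    voltage: float      # power quality

def is_valid(r: MeterReading) -> bool:
    """Basic plausibility checks before the reading enters any
    business process."""
    return r.kwh >= 0 and 100.0 <= r.voltage <= 260.0

def is_expired(r: MeterReading, now: datetime) -> bool:
    """True once the reading has outlived its shelf life and can be
    archived or disposed of."""
    return now - r.timestamp > SHELF_LIFE
```

Real validation rules are far richer (estimation, gap-filling, cross-checks against neighbouring meters), but even this skeleton shows why validity and retention policy belong in the data pipeline itself.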
Further, utilities must manage and analyse ever more event data for power flow and consumption. They also need to ingest, analyse and manage data about large external security risks and about compliance requirements. Grid security used to be relatively simple: infrastructure was at risk mainly from natural disasters and equipment failure.
Today, through the rise of potential security threats to critical infrastructure, sophisticated cyber-security threats present another big problem for utilities. There are also many new legal and regulatory issues associated with the smart grid for which compliance is a critical business issue.
Out with the old, in with the smart
Utilities have primarily used relational databases (RDB) to manage and analyse their data for the last 30 years. The RDB began to be used prominently by utilities during the 1970s, when storage media was very expensive.
The main advantage of the RDB was that storage costs could be minimised, but this came at the expense of writing substantial proprietary code to describe the relationships between data sets.
The shortcomings of RDB have become apparent as needs for analysis across multiple data types, formats and domains are created to enable situational awareness for the smart grid.
After the US DoE issued findings on the 2003 blackout, the Electric Power Research Institute (EPRI) began a project within its IntelliGrid programme to develop architectures to help utilities avoid another disaster characterised by a “lack of awareness of deteriorating conditions”.
The programme led EPRI and many of its member companies, including Versant Corporation, to create a system architecture capable of providing advanced warnings. The new system showed exactly where problems were occurring, and delivered the real-time intelligence needed to keep the lights on.
The technology was demonstrated in a joint EPRI-Versant presentation at the June 2012 Data Management and Analytics for Utilities Conference in Redwood City, CA. The demonstration used sizeable samples of utility data from the hours leading up to the 2003 blackout. Many utilities will need to make substantial and difficult changes to their existing systems to implement such architectures, but the demonstration showed some of the benefits of making these changes.
The demonstration platform provides an automated means by which to ingest, manage and analyse large, complex data volumes in real time. An advanced, high-performance data management architecture allows data to be modelled as objects and classified into object classes that inherently match those of the smart grid. This architecture, developed by Versant, provides an effective platform for the complex applications of the smart grid. Further, the architecture adheres to the IEC CIM standard, including interoperability among all network devices used in smart grid systems.
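To illustrate the object-modelling idea, the sketch below defines grid equipment as classes whose names loosely echo IEC CIM terminology (Substation, Breaker, Analog). This is a simplified illustration, not the actual CIM schema or Versant’s implementation:

```python
# Simplified object model of grid equipment, with class names loosely
# echoing IEC CIM terminology. Not the real CIM schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Analog:
    """A measurement attached to a piece of equipment."""
    name: str
    value: float
    unit: str

@dataclass
class Breaker:
    mrid: str                      # master resource identifier, as in CIM
    open: bool = False
    measurements: List[Analog] = field(default_factory=list)

@dataclass
class Substation:
    mrid: str
    breakers: List[Breaker] = field(default_factory=list)

    def open_breakers(self) -> List[str]:
        """Navigate the object graph directly rather than joining tables."""
        return [b.mrid for b in self.breakers if b.open]

sub = Substation("SUB-1", breakers=[
    Breaker("BRK-1", open=True,
            measurements=[Analog("current", 0.0, "A")]),
    Breaker("BRK-2", open=False),
])
print(sub.open_breakers())   # ['BRK-1']
```

The appeal of this style is that the containment and measurement relationships are part of the data model itself, so an application traverses objects the way an operator thinks about the grid, instead of reassembling them from rows at query time.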
Implementing the smart grid on a large scale presents many challenges and opportunities. The management and analytics of big data will be critical in many ways, including the development of sufficient SA systems. With the intelligent design of sensor networks and analytics systems, utilities can reduce operating costs and the risk of blackouts, mitigate potential security threats to the network, and improve their environmental impacts. Transforming the power grid is one of the great opportunities of this century, and with the intelligent use of information technologies, we can make it happen.
Bert Taube is Director of Energy and Smart Grid Solutions, and Robert F Brammer is Director of Versant Corporation