Volume and Velocity
Any company with many assets and customers will have accumulated a large amount of data since it began operating. This challenge is known as ‘volume’, one of the defining characteristics of ‘Big Data’. Smart Grids, however, add the challenge of ‘velocity’: sensors generate readings automatically every few minutes or even seconds, far too quickly for any person to make sense of them. Velocity in turn drives volume, so the amount of data the business receives continues to grow rapidly.
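A rough back-of-envelope calculation shows how velocity compounds volume. The fleet size, sampling intervals, and reading size below are hypothetical, chosen only for illustration:

```python
def daily_volume_bytes(devices: int, interval_seconds: int, bytes_per_reading: int) -> int:
    """Estimate raw bytes generated per day by a fleet of sensors."""
    readings_per_day = 86_400 // interval_seconds  # seconds in a day / sampling interval
    return devices * readings_per_day * bytes_per_reading

# Hypothetical fleet: one million smart meters, a 100-byte reading every 15 minutes
per_day = daily_volume_bytes(1_000_000, 15 * 60, 100)   # 9.6 GB per day
# The same fleet sampled every 10 seconds: higher velocity, 90x the volume
fast = daily_volume_bytes(1_000_000, 10, 100)           # 864 GB per day
```

Shortening the sampling interval alone, with no new devices, multiplies the daily volume, which is why velocity and volume are so tightly coupled in a Smart Grid.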
Reporting from silos
Besides these two issues, data in any corporation is generally stored in various systems, creating data silos across the business. When reporting on an asset is required, manual data processing becomes tedious and complicated because the following steps must be carried out:
- Sift data from multiple sources.
- Validate the data to ensure its correctness. Data collected in different ways can appear to conflict, and often needs cleaning and correction.
- Transform the data into the required reporting format.
- Fill in any missing data to ensure that all information required for the report has been covered.
- Visualize the information in suitable formats such as graphs to increase the effectiveness of the message.
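The sifting, validation, and consolidation steps above can be sketched as a small pipeline. The source systems, field names, and tolerance are invented for illustration, not taken from any real utility system:

```python
def curate(scada_row: dict, billing_row: dict, tolerance: float = 0.05) -> dict:
    """Sift, validate, transform and consolidate one asset's data from two silos."""
    # Sift: pull the load figure out of each source's own schema
    a = float(scada_row["load_kw"])     # SCADA exports load as a string, in kW
    b = float(billing_row["load"])      # billing system stores a float under another name
    # Validate: the two sources should agree within a tolerance
    if abs(a - b) / max(a, b) > tolerance:
        raise ValueError("sources conflict; data needs cleaning before reporting")
    # Transform and consolidate into the required reporting format
    return {"asset": scada_row["asset"], "avg_load_kw": round((a + b) / 2, 1)}

scada = {"asset": "TX-01", "load_kw": "1520", "timestamp": "2024-05-01T10:00"}
billing = {"load": 1495.0}
report_row = curate(scada, billing)  # {"asset": "TX-01", "avg_load_kw": 1507.5}
```

Even this toy version makes the pain visible: every new silo adds another schema to sift, another validation rule, and another transformation, which is why the manual process scales so poorly.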
Quick decisions needed
This manual processing of data is called data curation. For a power utility that must keep electricity flowing to consumers, however, the manual process takes too long. Without energy storage, the laws of physics dictate that generation and consumption must balance virtually instantaneously. A utility cannot afford lengthy data curation when it needs to restore its network after an outage. There are therefore strong requirements for automated, advanced computational frameworks in the power sector, such as the Wide Area Monitoring System (WAMS) and the Advanced Distribution Management System (ADMS). These frameworks perform real-time computation that provides rapid decision support to the utility.
Growing need for automated advanced data analytics
With the wider adoption of Smart Grids globally, the need for automated advanced data analytics will grow. In 2013, GTM Research forecast that cumulative global spending on Smart Grid related analytics between 2012 and 2020 would top USD 20.6 billion, with an annual global spend of USD 3.8 billion in 2020. GTM also estimated that, despite the huge investment, utilities would achieve a return of more than USD 121.8 billion over the nine-year period.[1]
Big Data is key
For any utility that wishes to embark on Smart Grids, Big Data will be a key consideration. With the multitude of smart technology options available today, a utility needs to think carefully about its data applications. Some of the considerations are:
- How will the data be used? For example, will it be used to control load operation or to improve long-term planning?
- Does the data require archiving and further analytics?
- Who will be using the data and how can they access it?
Applications influence platforms
The different applications will influence the choice of sensors and communication platforms because the data requirements can differ completely. Real-time operational data often requires fast sampling rates, and critical data may even require redundant communication paths. Data that requires archiving incurs storage costs, and deciding between an in-house storage system and a cloud solution is no small task. Data destined for analytics may need a certain level of accuracy to generate meaningful results. Data that must be accessible company-wide calls for an enterprise system rather than standalone software on a personal computer.
The need for International Standardization
Enterprise systems that handle utility data are also highly specialized in their functions. A system that handles control-room operations does not necessarily track the cost associated with addressing a fault. The data format in one system does not necessarily agree with that of another, which significantly lengthens any data curation activity. With many employees duplicating effort in handling data, waste in time and money is multiplied across the business.
This is where standardization plays a key role. The adoption of standard models such as the Common Information Model (CIM) by enterprise systems allows data to be shared and exchanged automatically.
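The idea behind a shared model can be sketched as follows. The class and field names here are invented for illustration and are not actual CIM classes; the point is that each system writes one mapping into the common model instead of a separate mapping to every other system:

```python
from dataclasses import dataclass

@dataclass
class CommonAsset:
    """Simplified stand-in for a CIM-style shared asset model."""
    mrid: str          # a shared unique identifier
    name: str
    rated_kva: float

def from_outage_system(rec: dict) -> CommonAsset:
    # The outage-management export uses its own field names and units (kVA)
    return CommonAsset(mrid=rec["id"], name=rec["equip_name"], rated_kva=rec["kva"])

def from_asset_register(rec: dict) -> CommonAsset:
    # The asset register uses different names and different units (VA, not kVA)
    return CommonAsset(mrid=rec["guid"], name=rec["label"], rated_kva=rec["rating_va"] / 1000)

a = from_outage_system({"id": "T1", "equip_name": "Feeder TX", "kva": 500.0})
b = from_asset_register({"guid": "T1", "label": "Feeder TX", "rating_va": 500_000})
assert a == b  # both silos now agree on a single representation
```

With N systems, pairwise translation needs on the order of N² mappings, while a common model needs only N, which is where the time and cost savings come from.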
1. David J. Leeds, The Soft Grid 2013-2020: Big Data & Utility Analytics for Smart Grid, GTM Research, 2013.