In what's become known as the smart grid, big data analytics has been a hot topic for a while now. Thanks to the significant evolution in big data technology, we've developed a number of different use cases at Intel. All have been driven by specific business needs and, more importantly, by the need to analyze the right data at the right time.
From mining smart meter data to enable better customer billing and engagement, to analyzing synchrophasor data from transmission lines to reduce losses in real time, big data analytics has become a vital part of overall business decision-making. For data collected over a period of time (say, a number of months), we often see analytics used as follows:
Descriptive Analytics: Taking data and analyzing it to see what happened. Use cases can range from calculating how much energy was consumed in order to generate a bill, to identifying the performance of a transformer over time.
Diagnostic Analytics: Analyzing and identifying why something happened. A prime use case would be identifying if there were any data patterns from a failed transformer. Do these patterns indicate degradation over time?
Predictive Analytics: Isolating a pattern and using it to determine what to watch out for. If you know what a failing transformer looks like, you can proactively check the data for all existing transformers of the same model. Are any of them showing signs of degradation?
Prescriptive Analytics: Developing strategies based on previously identified patterns. This moves the asset management strategy from a calendar-based approach to a predictive one. Many would say this is required if automated demand response is to become a reality in the smart grid.
Pre-emptive Analytics: Focusing on the what-ifs. This level of analysis is crucial when considering overall grid reliability and stability. With access to real-time data from the grid, you can run a real-time simulation to see the actual effects of an asset failure on all other assets.
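To make the predictive step above concrete, here is a minimal sketch of how a fleet-wide transformer check might look. All field names, readings, and the slope threshold are illustrative assumptions, not real Intel tooling or utility data: the idea is simply that a known degradation signature (here, a rising temperature trend) is applied across every transformer of the same model.

```python
# Hypothetical sketch of predictive analytics on transformer telemetry.
# A steadily rising oil temperature stands in for a "failing transformer"
# pattern; readings, IDs, and the threshold are invented for illustration.

def slope(readings):
    """Least-squares slope of evenly spaced readings (degrees per interval)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def flag_degrading(fleet, threshold=0.5):
    """Return IDs of transformers whose temperature trend exceeds threshold."""
    return [tid for tid, temps in fleet.items() if slope(temps) > threshold]

fleet = {
    "T-101": [61, 62, 61, 63, 62, 63],  # stable readings
    "T-102": [60, 63, 66, 70, 74, 79],  # rising trend: matches degradation pattern
}

print(flag_degrading(fleet))  # ['T-102']
```

In a real deployment the pattern would come from diagnostic analysis of failed units rather than a fixed slope, but the workflow is the same: isolate a signature, then scan the whole fleet for it proactively.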
The dream is that all the data within the grid, regardless of type (structured or unstructured), is available to be analyzed in real time in all sorts of ways to gain insight. Think of the technology shown in the movie Minority Report: the film depicts systems that allow people to manipulate vast volumes of data in 3D, using elaborate hand movements to look for undiscovered linkages in the data. Analysis like that could be easily monetized. While we still have a long way to go before this dream becomes a reality, today's big data deployments give hope for that future.
With this large range of solution requirements in businesses today, Intel's Xeon processors offer an ideal package of capabilities for computing and deploying different solutions. Technologies such as in-memory databases (like SAP HANA), Hadoop implementations from Cloudera, various real-time-optimized solutions, and full High Performance Computing (HPC) clusters all deliver solutions aimed at different use cases.
Deciding on the appropriate solution usually comes down not only to the use case, but also to how close to real time your analytics need to be. The time and latency parameters for a given analytics solution can vary widely. For example, the requirement to run a real-time synchrophasor analysis in milliseconds is vastly different from the requirement to generate a customized customer bill at the end of the month (think batch job).
What big data use cases have you used or implemented? Let us know.
14 June 2017