Situational Intelligence: The ‘Now’ Model for Utility Data Analytics

For decades, utilities have been collecting and analyzing data, and making business decisions based on those analyses. But they've never had as much data coming from as many devices on the grid as they're getting today -- and that means the next generation of utility data analytics may look very different from those that have come before.

Factors such as the proliferation of smart meters and grid sensors, the rise of distributed generation resources like rooftop solar and behind-the-meter batteries, and the emergence of customers equipped with new technologies to manage and control their electricity use are all bringing far more data into the utility purview than utilities have ever had to deal with before. And with much of this data flowing in close to real time, it is straining the bounds of the batch-processing, data-warehousing analytics tools of the past.
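
To make that strain concrete, the sketch below contrasts the two patterns in Python. It is purely illustrative -- the meter IDs, window size and readings are hypothetical, not any vendor's actual code -- but it shows why a nightly batch pass and a per-reading streaming update are fundamentally different workloads:

```python
# Illustrative only: contrast a nightly batch average with a streaming
# sliding-window average that updates the moment each reading arrives.
from collections import defaultdict, deque
from statistics import mean

def nightly_batch_average(readings):
    """Batch style: process a whole day's (meter_id, kw) tuples in one pass."""
    totals = defaultdict(list)
    for meter_id, kw in readings:
        totals[meter_id].append(kw)
    return {m: mean(vals) for m, vals in totals.items()}

class SlidingWindowAverage:
    """Streaming style: a short per-meter window, refreshed on every reading."""
    def __init__(self, window_size=12):
        self.windows = defaultdict(lambda: deque(maxlen=window_size))

    def ingest(self, meter_id, kw):
        self.windows[meter_id].append(kw)
        return mean(self.windows[meter_id])  # fresh answer in near real time

stream = SlidingWindowAverage()
print(stream.ingest("meter-42", 3.1))  # 3.1
print(stream.ingest("meter-42", 3.7))  # ~3.4 -- updated immediately
```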

GTM Research has pegged the value of the global utility data analytics market at a cumulative $20 billion between 2013 and 2020, growing from $1.1 billion in 2013 to nearly $4 billion by decade's end. This growth will be driven largely by the flood of new data, and by the desire to turn it from an overwhelming deluge into streams of business value.

Enter the contenders for this massive new market. In the past half-decade, we've seen IT giants like Oracle, IBM, SAS, Teradata, EMC and SAP make big investments in the field. On the industrial front, Siemens is integrating its operations and data management technologies, and General Electric and partner Pivotal are creating data lakes that could be tapped for grid analytics. Startup C3 Energy, backed by $105 million in VC investment, has landed Baltimore Gas & Electric as a marquee customer for its massive integrated data analytics approach.

These all-inclusive, leave-no-data-point-behind approaches promise great rewards at the end of the day. But they're expensive, and they take a long time to get up and running. They're also premised on the idea that utility customers are willing to commit to significant upfront investments with as-yet-uncertain outcomes.

In the meantime, another breed of software tools is bringing a different approach to the utility data challenge. In particular, two startups, Space-Time Insight and Bit Stew, have been landing big utility customers by putting disparate data streams and stores to use in applications that solve today's data management challenges -- not in years or months, but in weeks, or sometimes even days.

These companies aren't promising to provide every last analytics application utilities may want someday. Indeed, neither company bills itself as an analytics provider, per se. But they are offering utilities a way to gain a core understanding of what's going on with all their new networked grid devices. This, in turn, could serve as the launching pad for new analytics applications over time. In a utility data landscape that's changing so quickly, this approach could well provide a model for the rest of the industry to follow.
Delivering real-world value in days, not years

Take Bit Stew, the Vancouver, Canada-based startup that got its start with hometown utility BC Hydro. At last month's DistribuTECH conference, Bit Stew announced it has landed California utilities Pacific Gas & Electric, San Diego Gas & Electric, and Southern California Gas as new customers of its Grid Director platform, the software interface built on top of its MIx Core data engine.

In some cases, Bit Stew won those contracts in competition with some well-known players in the data analytics field, Franco Castaldini, the company's vice president of marketing, told me. While he wouldn't name them, he did say that the winning difference was Bit Stew's promised ability to get its system up and running much faster than its competitors could.

"Those customers are able to stand up a first instance of Grid Director, for example, with five different source systems, in two weeks, with one person," he said. "That's compared to a typical ETL [extract, transform, load] approach, where you're taking data from source systems, and pushing them into Hadoop or Cassandra" -- two well-known and much-used big data platforms -- "a process that can take months."
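
For readers unfamiliar with the ETL pattern Castaldini is contrasting against, here is a minimal, hedged sketch of one such pipeline in Python. The calls to the standard cassandra-driver package are real, but the keyspace, table schema and CSV export are hypothetical stand-ins, not Bit Stew's or any utility's actual pipeline:

```python
# A minimal ETL sketch: extract from one source system (a CSV export),
# transform to the target shape, load into Cassandra. The grid.meter_reads
# table and its columns are assumptions for illustration only.
import csv
from cassandra.cluster import Cluster  # pip install cassandra-driver

def extract(path):
    """Extract: pull raw rows from a source-system export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    """Transform: coerce types and map source fields onto the target schema."""
    return (row["meter_id"], row["read_ts"], float(row["kwh"]))

def load(rows, session):
    """Load: write the normalized rows into the (assumed) Cassandra table."""
    insert = session.prepare(
        "INSERT INTO grid.meter_reads (meter_id, read_ts, kwh) VALUES (?, ?, ?)"
    )
    for row in rows:
        session.execute(insert, row)

if __name__ == "__main__":
    session = Cluster(["127.0.0.1"]).connect()  # assumes a reachable cluster
    load((transform(r) for r in extract("ami_export.csv")), session)
```

The point of the sketch is not the thirty lines themselves, but that a real deployment needs one such pipeline, with its own field mappings and failure handling, for every source system -- which is where the months go.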

That's a critical capability for utilities that have been frustrated by past efforts to build an operational platform for putting their data to use, said Michael Allman, a former executive of Sempra Energy, the parent company of SDG&E and Southern California Gas, who joined Bit Stew as COO last month.

"If you look at an electrical utility, they've got a disparate [array] of subsystems -- a number of them homegrown, or systems they've been using for years, all from different companies -- such as customer information and billing systems, grid-asset health tracking systems, and distribution grid management systems," he said. "How do you take all these different systems, coming in different data formats at high speeds, and turn it into actionable information?"
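
One common way to frame that problem -- sketched below purely for illustration, with every system and field name hypothetical -- is a set of per-source adapters that normalize each subsystem's records into a single common event shape before any analysis happens:

```python
# Hypothetical adapters normalizing two disparate source formats into one
# common record type; all field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class GridEvent:
    source: str     # which subsystem produced the record
    asset_id: str   # identifier normalized across systems
    timestamp: str  # ISO 8601, normalized across systems
    payload: dict   # remaining source-specific fields

def from_billing(rec):
    """Adapter for a customer information/billing system export."""
    return GridEvent("billing", rec["acct"], rec["ts"], {"kwh": rec["usage"]})

def from_asset_health(rec):
    """Adapter for a grid-asset health tracking system."""
    return GridEvent("asset_health", rec["assetId"], rec["readingTime"],
                     {"temp_c": rec["temperatureC"]})

ADAPTERS = {"billing": from_billing, "asset_health": from_asset_health}

def normalize(system, rec):
    """Route a raw record through the adapter for its source system."""
    return ADAPTERS[system](rec)

print(normalize("billing",
                {"acct": "A-17", "ts": "2017-03-01T00:00:00Z", "usage": 412.5}))
```

Writing and maintaining those adapters by hand, one per subsystem, is exactly the slow, per-system work that the rapid setup times quoted above are meant to eliminate.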

"Companies have tried to do this for years, and have been unable to crack it," said Allman. "Bit Stew has been able to crack that nut -- and it has proven that now with several big customer wins, in very competitive environments."

That's a self-promotional description of Bit Stew's capabilities, of course. But Castaldini, who previously led product marketing for GE's Energy Management software solutions, agreed that traditional data management approaches have often struggled to solve these problems: "It slows down the project, it might break down the line, it frustrates the relationship between OT and IT, and it often leads to failure of potential projects."

Source: greentechgrid
