Analytics changing the modern utility

For decades, utilities were known as stodgy outfits that were slow to adapt to change.

How things have changed. In perhaps the biggest shift in the industry's history, utilities are embracing burgeoning data analytics to gain better efficiencies and provide better customer service.

According to ABI Research, spending on big data and analytics in the energy industry was expected to reach $7 billion in 2014, representing over 15 percent of overall cross-industry spending, and to grow to more than $21 billion by 2019.
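Those figures imply roughly a tripling of spending over five years, or a compound annual growth rate of about 25 percent (3^(1/5) ≈ 1.25).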

"Greater shareholder pressure is pushing many energy groups to improve their returns after having it easy in the past," said Aapo Markkanen, the chief analyst on the ABI Research study.

"In such a highly asset-intensive field, huge cost savings are possible by making the operations more driven by data," he continued. "Analytics allow the early movers to gain a critical competitive advantage over laggards, in a field where competing by the end product is seldom an option."

For those reasons and more, utilities big and small are rushing to analyze the reams of data that are now available to them. Utilities are putting numbers to work for them in a wide variety of use cases across the entire enterprise.

"Big data and our ability to apply it is fundamental to meeting a number of strategic objectives, including customer and employee safety; improving the overall customer experience; improved customer communications; understanding, meeting, and where possible exceeding customer expectations; outage prediction and resolution; and optimal operations of the distribution grid," said Steve Pratt, corporate technology officer at CenterPoint Energy.

"Changes to operations include installation and implementation of supporting technologies and restructuring of the processes and organizations that support them. Visualization of field forces and need has provided the opportunity for increased operational effectiveness," he added.

At San Diego Gas & Electric, data analytics are being put to work in several departments across the utility, including operations, IT and security, and customer-facing solutions, where the combination of social media data, customer contact data, and customer energy usage data should allow SDG&E to serve customers much more effectively.

"We expect that analytics will play a role in most or all of our business operations," said Hanan Eisenman, communications manager for San Diego Gas & Electric.

Eisenman elaborated that, using smart meters as the foundation, SDG&E has already launched a "Reduce Your Use" program that provides incentives to customers who cut their usage during critical peak events, helping to ensure reliability during the high-use summer months.

SDG&E is making the biggest use of analytics for customer service and weather tracking. On the customer side, the utility is deploying its "next best option" service for its contact centers, which provides agents with analytics-based recommendations for customers in real time.

"The recommendation is based on everything we're able to assemble on that specific customer and their context and needs, so SDG&E can act as a trusted energy advisor for our customers," said Eisenman.
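To make the idea concrete, a recommendation of this kind can be thought of as a simple scoring pass over whatever profile data is available when a customer calls. The sketch below is purely illustrative: the profile fields, offers, and scores are assumptions, not SDG&E's actual next best option service.

# Illustrative sketch only: the profile fields, offers, and scores are
# hypothetical, not SDG&E's actual "next best option" service.
def next_best_option(customer):
    """Rank a few candidate recommendations from the profile an agent
    sees in real time, and return the highest-scoring one."""
    candidates = [("No recommendation", 0.0)]
    if customer.get("peak_usage_kwh", 0) > 500:
        candidates.append(("Enroll in Reduce Your Use peak-day alerts", 0.9))
    if customer.get("late_payments", 0) >= 2:
        candidates.append(("Offer a level-pay billing plan", 0.7))
    if not customer.get("had_energy_audit", False):
        candidates.append(("Suggest a free home energy audit", 0.5))
    return max(candidates, key=lambda c: c[1])

print(next_best_option({"peak_usage_kwh": 620, "late_payments": 0}))
# -> ('Enroll in Reduce Your Use peak-day alerts', 0.9)

In practice the scoring would come from statistical models over customer contact, usage, and social media data rather than hand-written rules, but the shape of the service is the same: assemble the context, score the options, surface the best one to the agent.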

SDG&E has expanded its weather sensor network to approximately 150 state-of-the-art weather stations throughout its service territory in an effort to track adverse weather conditions during the dangerous Santa Ana fire season. Already the largest and most sophisticated such network operated by a utility in the nation, it measures everything from temperature and humidity to wind speed and solar radiation, giving SDG&E greater awareness of the state of the electric grid and valuable information to prevent wildfires and promote public safety.

SDG&E's weather analytics generate more than 30 TB per day. That data allows SDG&E to model the wind and temperature across its region with very high accuracy, enhancing the utility's situational awareness of the state of the grid.

"We use the inputs from approximately 150 stations to do fine-grained weather forecasting, and we use those forecasts to proactively deploy crews to areas with upcoming high winds and with higher fire potential," said Eisenman.
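As a rough illustration of how station forecasts might feed crew deployment, the sketch below flags stations whose forecast combines high winds with very low humidity, the classic fire-weather pattern during Santa Ana events. The station IDs, thresholds, and data feed are assumptions for illustration only, not SDG&E's operational models.

# Illustrative sketch only: station IDs, thresholds, and the forecast feed
# are assumptions, not SDG&E's operational fire-weather models.
from dataclasses import dataclass

@dataclass
class StationForecast:
    station_id: str
    wind_speed_mph: float   # forecast sustained wind speed
    humidity_pct: float     # forecast relative humidity

def flag_fire_weather(forecasts, wind_min=30.0, humidity_max=15.0):
    """Return stations forecast to see strong winds and very low humidity,
    i.e. candidate areas for pre-positioning crews."""
    return [f.station_id for f in forecasts
            if f.wind_speed_mph >= wind_min and f.humidity_pct <= humidity_max]

forecasts = [StationForecast("STN-001", 42.0, 9.0),
             StationForecast("STN-002", 18.0, 35.0)]
print(flag_fire_weather(forecasts))  # -> ['STN-001']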

Sila Kiliccote, the lead for grid integration at the Lawrence Berkeley National Laboratory in California, added that while good progress has been made on the demand response side, there is room for improvement when it comes to capturing and analyzing data from distribution and transmission systems.

"The integration of these sensing and analytics capabilities into distribution and transmission systems has been slow," she said. "Transmission systems in particular have been more involved in dealing with centralized controls and with large generators as opposed to many distributed assets, which is why the demand side has been more creative about using these technologies."

She continued, "We don't really see these demand-side capabilities going up to the distribution level, so we've got this last mile where there is a lack of sensors, lack of visibility and lack of incorporating these real-time capabilities into operations."

Kiliccote noted that the Lawrence Berkeley National Laboratory is conducting research to develop low-cost sensors for the distribution system so that distribution operators and planners can use data analytics to plan distribution system upgrades. In the future, she hopes the new data will go a step further and help optimize distribution system operations.

Source: intelligentutility
