The grid is changing from a one-way, centralized electricity delivery system to a distributed, networked system for generating, storing and consuming power -- and the technology is changing a lot faster than the economic and regulatory systems that govern it, as we've seen in states like California and New York that are struggling to catch up.
But what if all this complexity could be boiled down to one metric that captures the costs, benefits and tradeoffs in a way that can serve the grid's needs, not only on a minute-by-minute basis, but forecast over years to come? Grid software vendor Integral Analytics says it has an answer -- distributed marginal price, or DMP. Consider it the grid-edge equivalent of the locational marginal price (LMP) metric used by transmission grid operators to determine the marginal cost of delivering energy at any given point on the grid, at any given moment in time -- only extended down to millions of grid endpoints, and forecast over a decade or more.
"Our process goes through the standard calculation methods, to go from house to house to find the true cost to serve," Tom Osterhus, CEO of the Cincinnati-based privately funded utility software vendor, explained in an interview. "We're doing direct and accurate marginal costing -- which the regulator always wanted to do, but never had the computing power to do it."
There's a lot more than the cost per kilowatt-hour of generating power that goes into this calculation. Short-term factors include the distance from generation to consumption and the consequent efficiency losses; the value of providing reactive power, voltage support and other such technical services at different points on the grid; and broader grid reliability measures. In the long term, you've got to calculate how much these grid-edge resources could save you in power plants not built, or in feeder lines, transformers and substations not upgraded or replaced, and balance out which combination of technologies and strategies can provide the optimal outcome for utilities and customers alike.
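The components described above can be sketched as a toy calculation. Everything here -- the function name, the components and the numbers -- is a hypothetical illustration of how such factors might roll up into one per-kilowatt-hour figure, not Integral Analytics' actual DMP model:

```python
# Illustrative sketch only: hypothetical components and weights,
# not Integral Analytics' actual DMP methodology.

def dmp_per_kwh(energy_cost, loss_fraction, reactive_support_value,
                reliability_value, avoided_capex_per_kwh):
    """Combine short- and long-term components into one $/kWh figure.

    energy_cost: marginal generation cost ($/kWh)
    loss_fraction: share of energy lost between generation and this node
    reactive_support_value: credit for reactive power / voltage support ($/kWh)
    reliability_value: credit for local reliability benefits ($/kWh)
    avoided_capex_per_kwh: deferred grid investment, levelized ($/kWh)
    """
    # Line losses raise the effective cost of energy delivered at this node.
    delivered_cost = energy_cost / (1.0 - loss_fraction)
    # Local grid services and avoided infrastructure reduce the net price.
    return (delivered_cost - reactive_support_value
            - reliability_value - avoided_capex_per_kwh)

# A node far from generation (high losses) but rich in local services:
price = dmp_per_kwh(energy_cost=0.05, loss_fraction=0.08,
                    reactive_support_value=0.004,
                    reliability_value=0.003,
                    avoided_capex_per_kwh=0.01)
print(round(price, 4))
```

The point of the sketch is the structure, not the numbers: distance-driven losses push the localized price up, while locally provided services and deferred capital spending pull it down, so the same kilowatt-hour can carry very different values at different points on the grid.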
But once it's created, this DMP metric can serve as "a common point of reference" for multiple technologies that must be integrated to serve the common cause of delivering the best value for the lowest cost, he said. California regulators have just launched a process to convert a state law demanding integrated distribution grid planning into a working model for the state's big investor-owned utilities -- and Integral Analytics, which has been working with the likes of California utility Pacific Gas & Electric and grid operator California ISO, could represent a real-world software platform for making that happen.
These are the kinds of visionary software efforts that Greentech Media will be exploring at our Soft Grid 2014 conference next month in Menlo Park, Calif. The proliferation of smart meters, grid sensors, smart solar inverters, smart thermostats, building energy management systems and other distributed intelligent devices provides the data to make these kinds of software innovations possible. But Integral Analytics (IA) is one of the first to propose a concrete measurement of value for all these very different grid-edge resources.
So how does IA's software turn reams of data into a guide for moment-to-moment grid energy interactions, as well as insight into long-range distributed energy investment plans? Here's a breakdown, starting with the stuff that's happening in real time on the grid's edge.