A changing paradigm of electricity generation is transforming how the energy and utilities industry will manage distribution grids. With the rise of commercial and residential rooftop solar, wind generation, and battery storage units, utilities must now accurately forecast and schedule energy at the edge of the grid.
Conventionally, utilities have used ‘connected models’ to predict grid consumption and grid states. Based on a central generating model, a utility forecasts and schedules energy that cascades across its grid system in a reliable and predictable manner. However, with the proliferation of distributed energy resources (DERs), these models have become more error-prone and no longer accurately predict the state of the grid at some of its feeder circuits.
Planning for the New Distributed Grid
According to a recent study published by Lockheed Martin (titled ‘State of the Distributed Grid’), nearly 65% of utilities believe that the distributed grid’s impact will grow exponentially over the next 12-24 months. How utilities change the way they manage and control the distribution grid to cope with these new developments, perpetual innovation, and changing priorities is therefore critical to the industry’s fortunes.
Many utilities are bulking up their capabilities to manage the grid at medium voltage substations by implementing Advanced Distribution Management Systems (ADMS). These applications promise to enhance observability and control by extending traditional SCADA and other actuation capabilities. Thus, utilities can closely monitor and adjust how energy flows over the distribution network. ADMS can also improve power quality by enabling control over tap changers, capacitor banks, and other voltage regulation devices.
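To make the voltage regulation role concrete, the sketch below shows the kind of dead-band logic an ADMS might apply to a load tap changer: step the tap only when the measured voltage drifts outside a configured band. The function name, thresholds, and tap range are illustrative assumptions, not any vendor's actual control scheme.

```python
def next_tap_position(v_pu, tap, v_target=1.0, deadband=0.0125,
                      step=0.00625, tap_min=-16, tap_max=16):
    """Return an adjusted tap position for a measured per-unit voltage.

    Illustrative only: each tap step is assumed to change voltage by
    `step` per unit, and no action is taken while the voltage stays
    inside the dead band around `v_target`.
    """
    error = v_pu - v_target
    if abs(error) <= deadband:
        return tap  # within the dead band: leave the tap alone
    steps_needed = round(error / step)
    # Lower the tap when voltage is high, raise it when low,
    # clamping to the physical tap range.
    return max(tap_min, min(tap_max, tap - steps_needed))
```

A real ADMS would coordinate this with capacitor banks and other regulation devices, and would add delays and hysteresis to avoid excessive tap operations; the dead band here stands in for that coordination.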
ADMS applications are expensive, multi-year implementations that require major investment and commitment. There is a trade-off between the number of substations covered by these systems and the possible ‘blind spots’ left for utility operators at low voltage.
In the quest for better ways to manage the distribution grid, utilities are gradually gravitating toward an Internet of Things (IoT) approach that enables cost-effective pilots using a lightweight, open-source, and IP standards-based design. Such an approach can test the efficacy of monitoring electrical magnitudes (for example, current, voltage, and power factor) at low-voltage substations, at the feeder circuit level, without a long-term commitment to a heavyweight application. An industrial IoT approach not only allows experimentation and future scalability, but also demonstrates the value of this data: some of it is computed in the field by an energy gateway that backhauls the information, making it useful for operations, forecasting, and further analysis.
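As a sketch of the in-field computation such an energy gateway might perform, the function below derives RMS voltage, RMS current, real power, and power factor from synchronized waveform samples, producing a compact summary to backhaul instead of raw waveforms. The function and field names are assumptions for illustration, not a specific product's API.

```python
import math

def feeder_summary(v_samples, i_samples):
    """Summarize one measurement window of voltage/current samples.

    Assumes the two sample lists are time-aligned and cover a whole
    number of AC cycles.
    """
    n = len(v_samples)
    v_rms = math.sqrt(sum(v * v for v in v_samples) / n)
    i_rms = math.sqrt(sum(i * i for i in i_samples) / n)
    # Real power is the mean of the instantaneous v*i product.
    p_real = sum(v * i for v, i in zip(v_samples, i_samples)) / n
    s_apparent = v_rms * i_rms
    pf = p_real / s_apparent if s_apparent else 0.0
    return {"v_rms": v_rms, "i_rms": i_rms,
            "p_real": p_real, "power_factor": pf}
```

Shipping a few such numbers per feeder circuit, rather than raw sampled waveforms, is what keeps the backhaul lightweight enough for a low-cost pilot.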
The aforementioned Lockheed Martin study also observed that “nearly two-thirds of Commercial & Industrial (C&I) customers believe that their utilities should play an important role in helping them build a more distributed grid and their capabilities with DERs.”
This provides another well-founded reason for IoT to play a key role in how utilities integrate DERs for their C&I and residential customers. An IoT approach will enable utilities to extend the capabilities of their ADMS applications to their portfolio of consumer programs such as Smart Metering, Demand Response, and other energy efficiency initiatives. It can also integrate consumer-facing, internet-based systems with the OT systems that utilities will continue to expand to improve how they monitor and control the distribution grid of the future.