How do electric utilities determine the load demand on the network?

Tags: electricity, power-grid

Given that electric utilities are tasked with ensuring that load demand is always met, how do they determine the load demand to be met in the first place, in real time rather than from a forecast?

Best Answer

Broadly, the grid frequency is constant when load = generation, rises when generation exceeds load, and falls when there is insufficient generation. If you think about it, the whole thing is a torque balance between the turbines and the reaction torque on the rotors from the power being drawn.
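That torque balance can be sketched with the standard swing equation. This is a toy illustration, not any utility's actual model; the inertia constant and system base below are assumed numbers for demonstration.

```python
# Toy model: the grid's aggregate rotating machines behave like one big
# flywheel. A power imbalance accelerates or decelerates it, which moves
# the frequency. Swing equation: df/dt = f0 * (P_gen - P_load) / (2 * H * S)

F0 = 60.0        # nominal frequency, Hz
H = 5.0          # assumed aggregate inertia constant, seconds
S_BASE = 1000.0  # assumed system base, MW

def frequency_trajectory(p_gen_mw, p_load_mw, seconds, dt=0.1):
    """Integrate the swing equation for a constant power imbalance."""
    f = F0
    for _ in range(round(seconds / dt)):
        dfdt = F0 * (p_gen_mw - p_load_mw) / (2 * H * S_BASE)
        f += dfdt * dt
    return f

# Generation 50 MW short of load: frequency sags below 60 Hz,
# at 0.3 Hz/s with these assumed constants.
print(round(frequency_trajectory(950.0, 1000.0, seconds=5.0), 3))  # → 58.5
```

The takeaway is that operators never need to meter every load: the frequency itself is a real-time, grid-wide signal of the generation/load mismatch.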

Operators leverage this by configuring the various generation facilities with different set points and slopes on their speed governors. A base-load plant, for example (say hydro or nuclear), will typically be set to go to full output whenever the frequency is below, say, 60.25 Hz, so it is essentially always running flat out. A gas-turbine peaking plant, on the other hand, might be set not to load up until the frequency drops to 59.75 Hz, so that most of the time it is idling as spinning reserve. Something like a battery storage system will not load up until the grid drops further than that, whereupon it gets paid mad money to prevent a grid collapse for a few minutes while more gas turbines spool up or some load is dumped.
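A minimal sketch of how staggered governor set points produce this dispatch order; all the frequency thresholds and plant names here are invented for illustration, and real governors use droop characteristics rather than this simple linear ramp.

```python
# Each plant ramps linearly from idle to full output over its own
# frequency band. Plants whose band sits above nominal run flat out in
# normal operation; bands below nominal act as spinning reserve.

def governor_output(freq_hz, full_output_below_hz, idle_above_hz):
    """Fraction of full output: 0 above idle_above_hz, 1 below full_output_below_hz."""
    if freq_hz <= full_output_below_hz:
        return 1.0
    if freq_hz >= idle_above_hz:
        return 0.0
    return (idle_above_hz - freq_hz) / (idle_above_hz - full_output_below_hz)

PLANTS = {                            # (full output below, idle above) in Hz
    "base_load":  (60.25, 60.75),     # flat out whenever f < 60.25 Hz
    "gas_peaker": (59.25, 59.75),     # idles until f drops below 59.75 Hz
    "battery":    (59.00, 59.25),     # last-ditch reserve
}

for f in (60.00, 59.60, 59.10):
    loading = {name: round(governor_output(f, lo, hi), 2)
               for name, (lo, hi) in PLANTS.items()}
    print(f, loading)
```

At nominal 60 Hz only the base-load plant is producing; as the frequency sags, the peaker and then the battery pick up load automatically, with no central command needed.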

That deals with generation, but there are also serious constraints on transmission: you can only shove so many MVA over a given transmission line before either the insulation fails or the conductors overheat and sag.
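A back-of-the-envelope check of line loading against its thermal limit; the voltage and current rating below are assumed example figures, and real ratings vary with ambient temperature and wind.

```python
import math

def line_mva(kv_line_to_line, amps):
    """Apparent power in MVA for a three-phase line: S = sqrt(3) * V * I."""
    return math.sqrt(3) * kv_line_to_line * amps / 1000.0

# 230 kV line with an assumed 1200 A thermal (sag-limited) rating:
limit_mva = line_mva(230.0, 1200.0)    # ~478 MVA
actual_mva = line_mva(230.0, 950.0)    # ~378 MVA
print(round(limit_mva, 1), round(actual_mva, 1), actual_mva < limit_mva)
```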

Power flow is controlled in the first instance by controlling reactive power flow, switching inductors or capacitor banks in and out, and the game is to always maintain N+1 redundancy against the anticipated load, so that a single failure in the transmission network will not turn into a cascading failure that takes out half the state.
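The N+1 idea can be sketched as a single-outage (often called N-1 contingency) check. This is a gross simplification with made-up numbers: real tools solve full AC power flows for each contingency, not a bare capacity comparison.

```python
# For each single line outage, verify the remaining lines in a corridor
# can still carry the anticipated load.

def survives_any_single_outage(line_capacities_mva, load_mva):
    """True if remaining capacity covers the load after losing any one line."""
    total = sum(line_capacities_mva)
    return all(total - c >= load_mva for c in line_capacities_mva)

corridor = [400.0, 400.0, 250.0]  # hypothetical parallel lines, MVA

print(survives_any_single_outage(corridor, 600.0))  # → True: worst loss leaves 650
print(survives_any_single_outage(corridor, 700.0))  # → False: losing a 400 leaves 650
```

When the second check fails, the operator must either reconfigure flows or shed load *before* the outage happens, which is exactly the "loss of redundancy" trigger described below.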

If grid operations see a loss of redundancy in either generation or distribution, they will (and should) shed load, simply because a lot of generation depends on the grid being up in order to operate. If the grid goes away, many, many large-scale generators will trip offline (and, worse, their steam production systems will trip offline too), so you wind up having to restart things with a lot of your generation down for a few days to a few weeks. Rolling blackouts are MUCH less disruptive than letting that happen.
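Load shedding is typically automated as staged underfrequency load shedding (UFLS). The thresholds and percentages below are invented for illustration; real schemes are set by regional reliability rules.

```python
# Each stage trips a slice of load when the frequency crosses its
# threshold, buying time for reserves to spool up.

UFLS_STAGES = [   # (trip frequency Hz, fraction of system load to shed)
    (59.3, 0.10),
    (58.9, 0.10),
    (58.5, 0.10),
]

def load_to_shed(freq_hz, total_load_mw):
    """Sum the shed fractions of every stage whose threshold has been crossed."""
    fraction = sum(pct for trip_hz, pct in UFLS_STAGES if freq_hz <= trip_hz)
    return round(total_load_mw * fraction, 1)

print(load_to_shed(59.5, 10000.0))  # → 0.0     no stage tripped
print(load_to_shed(59.0, 10000.0))  # → 1000.0  first stage only
print(load_to_shed(58.4, 10000.0))  # → 3000.0  all three stages
```

The staging matters: shedding 10% early usually arrests the frequency decline, so the deeper, uglier stages never fire.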

For example, a civil nuclear plant that loses grid connectivity will automatically scram the reactor, and a restart is impossible until the core poisons have decayed sufficiently for controlled delayed criticality to again be possible, maybe a week, and the regulatory types will often extend that. Large-scale coal is a similar story (boiler steel thermal behaviour in that case): once you lose the grid, you are looking at a long time to get it back, especially if the surrounding grids are not in a position to help out.

You really do not want a grid collapse in the transmission system; rolling blackouts (or even 'just' switching off a city or two) are MUCH less messy than that.

Today we have some very good software that provides simulation support for the dispatchers running this stuff, and grid stability is relatively easy to monitor in real time; this was not always the case. Still, being the dispatcher whose +1 has just tripped offline, with the frequency dropping and LA on Oscars night being the best choice for a load dump, is when you EARN your pay!
