
Can someone knowledgeable explain this?

> But the situation was only growing more dire. At the precise time of the third call, the frequency reached a critical threshold: 59.4 hertz.

> Automated turbines across the state began spinning even faster to produce more electricity, but when the frequency dips below 59.4 hertz, the turbines reach speeds and pressures that can cause catastrophic damage to them, requiring that they be repaired or replaced.

Is this actually a way to bump up power generation capacity - by getting the turbines to "spin faster" rather than spinning up more turbines? Also, if the turbines are spinning faster, why does the frequency of the generated AC drop?



There is not enough information in the article to actually understand what they mean. Spinning the generator a bit faster or slower is not the issue; generators should always be operated within their limits. The problem is that when one region is overloaded, it loses synchronization with the rest of the grid, and that causes issues. Before that happens, some regions are shut off (local controlled blackouts), the grid is split up, and some power plants even go offline.

In general it's like this (a toy numerical sketch follows the list):

* Generators in the whole grid need to be synced up (frequency identical and phase difference small).

* If the generators in one area cannot keep up with the load in that area, the frequency drops a bit (their phase falls behind).

* Power lines transmit power from areas with spare generation capacity (where the phase is ahead) to areas lacking capacity (the lagging area's phase is pushed forward and the frequencies stay identical).

* Power lines have limited capacity.

* If the transmitted power is not enough to satisfy demand in that area, the generators slow down further (the phase falls further behind).

* If one area gets too far out of phase, it loses synchronization completely, and there would be catastrophic failures in power lines, transformers, generators, switches, etc.

* To prevent catastrophic failure, all areas agree on a small band of frequency variation inside which the regulation mechanisms work.

* If the frequency falls below that band, the grid disconnects into islands.

* The islands will either go dark completely (with the power plants in those areas shutting off) or have an oversupply of electricity (with their power plants reducing production).

* Regions with an oversupply go back to nominal frequency, sync up, and reconnect. Then, bit by bit, the other areas are reconnected.
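
To make the mechanics above concrete, here is a minimal single-area sketch in Python of the swing equation: the frequency integrates the generation/load imbalance, scaled by the rotating inertia. The inertia constant, system base, and power figures are illustrative assumptions, not real grid data; only the 59.4 Hz threshold comes from the article.

    # Toy swing-equation model of one grid area (illustrative values only).
    # df/dt = (P_gen - P_load) * f0 / (2 * H * S_base)
    F0 = 60.0        # nominal frequency, Hz
    H = 5.0          # inertia constant, s (assumed)
    S_BASE = 1000.0  # system base, MW (assumed)
    TRIP = 59.4      # under-frequency threshold from the article

    def simulate(p_gen_mw, p_load_mw, seconds, dt=0.1):
        """Integrate frequency under a constant generation/load imbalance."""
        f = F0
        for step in range(int(seconds / dt)):
            f += (p_gen_mw - p_load_mw) * F0 / (2 * H * S_BASE) * dt
            if f <= TRIP:
                return f, (step + 1) * dt  # shed load / split into islands here
        return f, seconds

    f, t = simulate(p_gen_mw=950.0, p_load_mw=1000.0, seconds=60.0)
    print(f"{f:.2f} Hz after {t:.1f} s")  # a 5% deficit sags to 59.4 Hz in ~2 s

A real grid adds governor response, load shedding, and line limits on top of this, but the core feedback is just this integration.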


Much of that involves dealing with frequency and phase. Would a black start be easier in an HVDC grid?


Yes. Many things are easier with HVDC, except producing the hardware. It's also far more efficient at transmission.

The AC power grid is mostly a historical accident; it's only been recently that we gained the ability to fully control DC power at grid scale.


> Is this actually a way to bump up power generation capacity - by getting the turbines to "spin faster" rather than spinning up more turbines? Also, if the turbines are spinning faster, why does the frequency of the generated AC drop?

The article might be explaining it wrong.

They want the turbines to spin faster because they're generating at 59.4 Hz when they should be at 60. Keeping it at 60 is a balancing act: when there is more load, the turbines tend to slow down, and to maintain speed you need more steam pressure.

You can have a 50 MW turbine generate 25 MW of electricity. It still spins at the same speed, but because there is only 25 MW of load, it needs less steam pressure to maintain speed. When the load increases, they increase the pressure and it generates more power while spinning at the same speed.

But if you take a bunch of generation capacity out of the grid, you now have a 50 MW turbine with 65 MW of load. Putting enough steam pressure to generate 65 MW of power into a 50 MW turbine is out of spec and could damage it, but put less than that in and the frequency falls.
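
To put rough numbers on the two paragraphs above (a 2-pole machine and torque figures derived from it, purely for illustration): at constant synchronous speed, output is set by steam torque via P = torque * omega, so a 65 MW draw on a 50 MW unit asks for 130% of rated torque, which is the out-of-spec condition described.

    # Rough sketch of the 50 MW turbine example (illustrative numbers).
    # At constant synchronous speed, P = torque * omega: output is set by
    # steam torque, not shaft speed.
    import math

    RATED_MW = 50.0
    OMEGA = 2 * math.pi * 60  # rad/s for a 2-pole machine at 60 Hz (3600 RPM)
    RATED_TORQUE = RATED_MW * 1e6 / OMEGA  # N*m at rated output

    for load_mw in (25.0, 50.0, 65.0):
        torque = load_mw * 1e6 / OMEGA  # steam torque needed at 60 Hz
        pct = 100 * torque / RATED_TORQUE
        note = " (out of spec)" if pct > 100 else ""
        print(f"{load_mw:>4.0f} MW -> {pct:.0f}% of rated torque{note}")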


My understanding is that the author is not a technologist and that these aren't literal physical descriptions of what happened with the turbines. The simplest generators maintain a steady angular velocity that (when normalized for the number of poles) corresponds to the frequency of the output signal.
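
For reference, that speed-to-frequency relation is f = (RPM / 60) * (poles / 2). A quick sketch (the pole counts are just examples):

    def frequency_hz(rpm, poles):
        """Electrical frequency of a synchronous generator from shaft speed."""
        return rpm / 60 * (poles / 2)

    print(frequency_hz(3600, 2))  # 60.0 Hz, a typical 2-pole steam turbine
    print(frequency_hz(3564, 2))  # 59.4 Hz, the article's threshold
    print(frequency_hz(1800, 4))  # 60.0 Hz again, with four poles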

To give the benefit of the doubt to the author, it is also true that a turbine shaft can be connected to a generator via a gearbox, in which configuration a faster turbine would correspond to more power delivered to the generator at the same generator RPM.


Deviation from 60 Hz is a measure of an overloaded or oversupplied grid.

They were "dialing up" the turbines to compensate for the overload, but it was insufficient and was pushing the turbines to redline/exceed their operating allowances.


This is a hypothetical explanation because I don’t know how grids work in practice. The faster turbines turn, the more voltage they generate, because the rate of change of flux with respect to time increases, per Faraday’s law. This means they can compensate for the voltage drops across the network due to increased current, per P = I*V.

I think the frequency dropped because there was too much load on the grid. More load puts more electromagnetic drag on the generators; this means that, without additional power into the turbines, they respond by turning slower. Like having to bike uphill.
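
A toy version of the Faraday point (made-up machine constant; a real machine is messier): open-circuit EMF scales with both field flux and rotor speed, so a 1% speed sag alone only moves the voltage about 1%.

    # Toy Faraday sketch: EMF ~ flux * angular speed for a synchronous machine.
    import math

    K = 1.0  # machine constant, made up for illustration

    def emf(rpm, flux=1.0):
        """Open-circuit EMF ~ flux * omega (toy model)."""
        return K * flux * rpm * 2 * math.pi / 60

    print(emf(3564) / emf(3600))  # ~0.99: the sag to 59.4 Hz barely moves voltage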


If grid-scale equipment works like 80 kW generators, the voltage is electronically controlled by changing the excitation voltage. An overload condition manifests as a decrease in frequency while the voltage is still fine.
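
A per-unit toy of that (all values illustrative): terminal voltage goes roughly as flux times speed, so the exciter can raise the flux about 1% and mask a 1% speed sag, while the frequency error remains visible.

    # Toy per-unit model: excitation holds voltage while frequency sags.
    def terminal_voltage(flux_pu, speed_pu):
        """Crude no-load approximation: voltage ~ flux * speed."""
        return flux_pu * speed_pu

    speed = 59.4 / 60.0                          # 0.99 pu shaft speed
    print(terminal_voltage(1.0, speed))          # 0.99: voltage sags with speed
    print(terminal_voltage(1.0 / speed, speed))  # 1.0: exciter compensates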



