Why is the mains voltage generally above the nominal value?
I am not talking about power spikes, which go outside the margins; we are talking about normal operation. By design, the voltage is set closer to the top of the tolerance band than to the middle. These are the reasons:
Standard power generators all run at a rotational speed that is synchronized with the grid frequency. The rotational speed also depends on how many poles the generator is equipped with; all 4-pole generators in 50 Hz grids run at 1500 rpm, for instance.
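To make the speed/pole relationship concrete, here is a minimal sketch using the standard synchronous-speed formula n = 120·f/p (the function name is my own):

```python
# Synchronous speed of a grid-connected generator: n [rpm] = 120 * f / p,
# where f is the grid frequency in Hz and p is the number of poles.

def synchronous_speed_rpm(frequency_hz: float, poles: int) -> float:
    return 120.0 * frequency_hz / poles

print(synchronous_speed_rpm(50, 4))  # 1500.0 rpm, the example from the text
print(synchronous_speed_rpm(60, 4))  # 1800.0 rpm in a 60 Hz grid
print(synchronous_speed_rpm(50, 2))  # 3000.0 rpm for a 2-pole turbo generator
```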
Grid frequency is just about the only persistently constant value you can expect from the grid.
At that fixed speed, the power output of a generator is regulated by the excitation of the field coils and by the mechanical input at the turbine or engine. Both values must be regulated in unison: if you increase the excitation without increasing the mechanical input, the machine will slow down and fall out of sync, which must be prevented.
Some kinds of power plants run asynchronously (flywheel, solar, and wind, mostly), which means their power output has to be electronically regulated to fit onto the grid.
For several reasons the power suppliers will regulate towards the upper end.
First, they can react more quickly to reduce power output: divert some steam, reduce excitation, done. To ramp up, they must first make more steam, which takes time. So it is safer to sit near the top limit.
Secondly, the same power can be transported more efficiently when the voltage is higher. Losses come almost exclusively from current; higher voltage means less current for the same power, so less loss, a bigger percentage of the voltage arrives at the customer, and only the power that actually arrives gets paid for.
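As a simplified, single-phase back-of-the-envelope illustration (all numbers below are made up, not real grid data): for a fixed delivered power P and line resistance R, the current is I = P/V, so the line loss I²R falls with 1/V².

```python
# Why higher voltage means lower line losses: same power, less current.
# Single-phase simplification with made-up numbers.

P = 100e3   # 100 kW delivered (assumed)
R = 0.1     # ohms of line resistance (assumed)

for V in (230, 240, 11_000):
    I = P / V         # current needed at this voltage
    loss = I**2 * R   # I^2 * R loss in the line
    print(f"{V:>6} V: {I:7.1f} A, line loss {loss/1e3:7.3f} kW")
```

Even the small step from 230 V to 240 V cuts the line loss by roughly 8 %, since the loss scales with (230/240)².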
Lastly, a part of the load is pure electrical resistance, which draws more power at a higher voltage, leading to higher consumption and higher sales. I suppose this is not a big deal.
Now, the power suppliers know very well how much power will be consumed on average. They know how much more will be needed on special days like Thanksgiving (every stove is in action that day) or on Super Bowl Sunday. They plan ahead for quite a while.
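A quick sketch of that effect, again with an illustrative (assumed) load: a purely resistive heater draws P = V²/R, so a few percent more voltage means a few percent more energy sold.

```python
# A purely resistive load draws P = V^2 / R, so a higher supply voltage
# directly means higher consumption. The resistance value is assumed.

R = 26.45   # ohms; a heater sized for about 2 kW at 230 V (assumed)

for V in (220, 230, 240):
    print(f"{V} V -> {V**2 / R:7.1f} W")
```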
The quality of the grid lines is taken into consideration here: if they know that the voltage drop within a neighborhood is rather high, the supply to that neighborhood will be set up so that the planned voltage arrives at the customers, if possible. Transformers between the high/medium/low voltage networks can be regulated to some degree (see ULTC at http://en.wikipedia.org/wiki/Tap_%28transformer%29).
Voltage drops and phase shifts are therefore the bane of the suppliers: both lead to bigger losses in the lines, which they have to pay for themselves.
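As a rough illustration of the idea (the tap step size, tap range, and voltages below are assumptions, not data from any real transformer), a tap position can be chosen so that the expected feeder drop is compensated:

```python
# Illustrative sketch of picking an on-load tap changer (ULTC) position so that
# the voltage arriving at the customer, after a known feeder drop, is close to
# the planned value. All constants are assumptions for the example.

NOMINAL = 230.0      # target customer voltage in volts (assumed)
TAP_STEP = 0.0125    # 1.25 % ratio change per tap step (typical order of magnitude)
TAPS = range(-8, 9)  # +/- 8 steps around the nominal ratio (assumed)

def best_tap(source_voltage: float, expected_drop: float) -> int:
    """Pick the tap whose output, minus the expected feeder drop,
    lands closest to the nominal customer voltage."""
    def error(tap: int) -> float:
        delivered = source_voltage * (1 + tap * TAP_STEP) - expected_drop
        return abs(delivered - NOMINAL)
    return min(TAPS, key=error)

# Example: 228 V at the substation, 6 V typical drop in the feeder -> tap +3.
print(best_tap(228.0, 6.0))
```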
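A small sketch of why the phase shift matters (made-up numbers): for the same real power P, the line current is I = P/(V·pf), so line losses grow with 1/pf².

```python
# How a poor power factor (phase shift between voltage and current) increases
# line losses for the same real power delivered. Numbers are made up.

P, V, R = 10e3, 230.0, 0.2   # 10 kW real power, 230 V, 0.2 ohm line (assumed)

for pf in (1.0, 0.9, 0.7):
    I = P / (V * pf)   # current needed to deliver P at this power factor
    print(f"pf {pf:0.1f}: {I:5.1f} A, line loss {(I**2) * R:6.1f} W")
```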
60 Hz was the result of engineering tradeoffs, I think made by or influenced by Nikola Tesla. He was one of the early proponents of distributing AC, as opposed to Edison, who wanted to distribute DC. The tradeoff had to do with the size of the machines and transformers needed, which get smaller with higher frequency, and with certain losses, which go up with frequency. I remember reading that some careful study went into the decision to pick 60 Hz.
50 Hz, on the other hand, was due to marketing. There was a German manufacturer of power grid equipment that wanted to distinguish itself and managed to get 50 Hz pushed through as the standard in Germany and then in much of Europe. This meant they didn't have to compete with the American 60 Hz equipment. The rest of the world ended up with 60 or 50 Hz depending on who they bought their equipment from and whether they were more economically tied to Europe or to the US. Since Russia adopted the European 50 Hz standard, the Soviet bloc countries all became 50 Hz countries.
See also this primer from UST Power, which explains a number of different techniques for automatic voltage regulation.