Your question is very general, and so is this answer.
When a power plant creates power like the Hoover Dam, it can provide 2.07 GW of electrical power. My question is: what does this mean? I assume from Faraday's law that the induced voltage across the generator coil produces a current, and this combination (P = VI) is the actual power, but I'm sure my thinking is naïve. Can someone roughly sketch out how the electrical power of a power plant is computed?
From a mechanical perspective
"2.07 GW" means that the peak output of the power plant is 2.07 GW. This is most likely a series of smaller units, say 20 × 100 MW units = 2.0 GW.
The generator is a converter of mechanical energy into electrical energy. So to deliver 2.07 GW of electrical power, at least an equivalent amount of mechanical power has to be supplied. In the case of the Hoover Dam, the mechanical energy is provided by water falling to a lower elevation, giving up its gravitational potential energy in the process.
From this perspective, you can think of the maximum electrical output power of a power plant as the maximum rate at which it can convert mechanical energy into electrical energy, factoring in the efficiency of the conversion process.
How much mechanical power is available is a mechanical engineer's problem. For a hydroelectric station, it depends on the head (water pressure), the flow rate through the turbines, and various design parameters. For a wind turbine, it scales with the swept area of the blades and the cube of the wind speed. And so on. A rough back-of-the-envelope sketch for the hydro case is below.
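As a quick illustration of that hydro calculation, the hydraulic power delivered to the turbines is roughly P = η·ρ·g·Q·h, where Q is the volumetric flow rate and h is the effective head. The numbers below are placeholder assumptions for the sketch, not actual Hoover Dam figures.

```python
# Rough sketch: hydraulic power available to a hydro turbine.
# P = eta * rho * g * Q * h
# All figures below are illustrative assumptions, not Hoover Dam data.

rho = 1000.0   # density of water, kg/m^3
g = 9.81       # gravitational acceleration, m/s^2
Q = 400.0      # volumetric flow rate, m^3/s (assumed)
h = 160.0      # effective head, m (assumed)
eta = 0.90     # combined turbine/generator efficiency (assumed)

P = eta * rho * g * Q * h   # power in watts
print(f"Mechanical power converted: {P / 1e6:.0f} MW")   # roughly 565 MW
```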
From an electrical perspective
Yes, the electrical energy produced follows Faraday's law and Ohm's law, though for an AC system V and I are sinusoids which may not be in phase, so in general P ≠ VI. Rather, with V and I as RMS values, the apparent power (volt-amperes) is S = VI and the real power (watts) is P = VI cos φ, where φ is the phase angle between voltage and current.
Other complications include electrical losses (per Ohm's law) and magnetic losses (eddy currents induced in metallic parts).
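As a quick numerical sketch of the apparent-versus-real-power distinction, using made-up RMS values and phase angle (not figures for any particular machine):

```python
import math

# Sketch: apparent vs. real power for a single-phase AC circuit.
# V, I are RMS values; all numbers are illustrative assumptions.
V = 11_000.0             # RMS voltage, volts (assumed)
I = 1_000.0              # RMS current, amperes (assumed)
phi = math.radians(25)   # phase angle between V and I (assumed)

S = V * I                  # apparent power, volt-amperes
P = V * I * math.cos(phi)  # real power, watts
print(f"S = {S/1e6:.1f} MVA, P = {P/1e6:.1f} MW, power factor = {math.cos(phi):.2f}")
```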
If possible, what kind of voltages and currents do power plants produce before the step-up transformers? Of course, this varies from one power plant to another.
Regarding typical voltages
In my experience, small generators (e.g. diesel gen-sets) generate directly at the utilisation voltage, say 415 V here in Australia.
Larger power station units generate at a medium voltage like 11 kV before stepping up to a transmission voltage, e.g. 132 kV.
I imagine a medium voltage like 11 kV is preferred over a high voltage like 33 kV because less insulation is required on the windings, and the rotating parts may be physically lighter.
Regarding typical currents
An aeroderivative gas turbine, e.g. the General Electric LM6000, is typically rated at about 45 MW and might have a 60 MVA alternator attached to it. Calculation of the three-phase line current at 11 kV is left as an exercise to the reader. Don't forget your √3.
A coal power station unit might be rated 400 MVA at 22 kV. See "Tarong Power Station" in QLD, Australia, which consists of four large units like this. Again, calculation of the line current is left as an exercise.
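For readers who want to check their working on those exercises, here is a minimal sketch of the three-phase line-current calculation, I = S / (√3 · V_LL), using the indicative ratings quoted above:

```python
import math

def line_current(S_va: float, V_ll: float) -> float:
    """Three-phase line current from apparent power and line-to-line voltage."""
    return S_va / (math.sqrt(3) * V_ll)

# LM6000-class unit: ~60 MVA alternator at 11 kV (indicative figures)
print(f"Gas turbine unit: {line_current(60e6, 11e3):.0f} A")   # ~3150 A

# Large coal unit: ~400 MVA at 22 kV (indicative figures)
print(f"Coal unit: {line_current(400e6, 22e3):.0f} A")         # ~10500 A
```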
Note: I am at home and don't have access to my reference material at work. The above numbers are indicative, so take them with a grain of salt.
If you are curious as to the exact operating principles and theory of an AC generator, I would encourage you to look up a textbook on electric machinery. My personal favourite is Mulukutla Sarma's Electric Machines. Check your university library for a copy.
Best Answer
A power plant rated at 1 GW can produce 1 GW of power at the rated conditions.
If it has an efficiency of 20%, then it will be consuming energy in some form at a rate of 5 GW to do that.
If the power plant is (say) a thermal steam plant, then the calculations are fairly easy, because we can assume it runs continuously as long as fuel arrives: it will generate 1 GWh of energy per hour. Note that 20% efficiency is pretty poor, archaic even, for a combustion-based plant, but might be reasonable for geothermal (low-temperature) sources.
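A one-line sanity check of the arithmetic in the two paragraphs above (nothing here beyond the stated 1 GW rating and 20% efficiency):

```python
# Sanity check: 1 GW electrical output at 20% overall efficiency.
P_out = 1e9          # electrical output, watts
eta = 0.20           # overall efficiency
P_in = P_out / eta   # rate of energy consumption, watts

hours = 1.0
E_out_GWh = P_out * hours / 1e9   # output power (GW) times hours = GWh

print(f"Input power required: {P_in/1e9:.0f} GW")            # 5 GW
print(f"Energy generated in {hours:.0f} h: {E_out_GWh:.0f} GWh")  # 1 GWh
```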
If the power plant is solar, then
(A) it is weather dependent (it doesn't work well in cloudy weather),
(B) it is time-of-day dependent (it doesn't work as well when the sun is low in the sky), and
(C) the efficiency is less relevant than for a fuel-burning plant, since whatever sun it intercepts comes for free (subject to (A) and (B) above); however, the cost of plant, installation, and real estate to deploy it increases with lower intercept efficiency, so manufacturers will continue to improve it.
We come back to the rating conditions. Is that 1 GW at solar maximum, at the best time of day? Or is it an average between the hours of (say) 10 am and 4 pm? You will need to read the fine print to find out what the 1 GW really means.
Once you know the output as a function of time of day and of seasonal weather, you can integrate to find the amount of energy the solar plant will produce over a day or a year, as in the sketch below.
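A minimal sketch of that integration, assuming a made-up clear-sky output profile (rated 1 GW only near midday, tapering toward morning and evening); the profile shape, daylight window, and time step are all assumptions for illustration:

```python
import math

# Sketch: integrate a solar plant's output over one day.
# The profile below is an assumed clear-sky shape, not data for any real plant.
P_rated = 1e9   # rated output, watts

def output_w(hour: float) -> float:
    """Assumed output: half-sine between 06:00 and 18:00, zero at night."""
    if 6.0 <= hour <= 18.0:
        return P_rated * math.sin(math.pi * (hour - 6.0) / 12.0)
    return 0.0

dt_h = 0.25  # time step, hours
energy_wh = sum(output_w(step * dt_h) * dt_h for step in range(int(24 / dt_h)))
print(f"Energy produced over the day: {energy_wh / 1e9:.1f} GWh")   # ~7.6 GWh
```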