Good mains voltage is a clean sinusoidal signal (or several phases thereof, as the case may be) free of dips, spikes and excessive distortion, whose frequency is stable over time, and whose amplitude stays within several percent (say, five) of the nominal value. A good mains supply also allows an appliance to draw the maximum rated current for the circuit it is plugged into without experiencing a voltage drop that takes the voltage out of range.
Good mains voltage is a function of the ability of the grid, as well as the local circuit, to handle the load which is placed upon it.
It is also a function of the quality of the local wiring installation in a building, and also of the quality of the devices that are plugged in. Devices can degrade power quality seen by other devices by spewing noise into the wiring.
This kind of design assignment should not be too difficult to break down into constituent parts, so you can replace one big problem with a bunch of smaller problems. Hopefully you already know how to solve some, or all, of the smaller problems.
If you start at the output and work backwards: you want 3 LEDs driven by some kind of circuit that controls their illumination. Let's say you have a "DC" voltage that represents the RMS AC input voltage that you've been asked to measure. By "DC" I mean rectified and low-pass filtered so that it has little ripple. Say the voltage is 10V for 230V RMS, 9.5V for 5% low, and 10.5V for 5% high. So you need to design a circuit that will illuminate the Red, Green or Yellow LED based on that voltage (3 states, so they can be encoded with two comparison bits). That's one smaller problem.
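The two-comparison, three-state decision can be sketched in a few lines. This is just the logic of a window comparator, not a circuit; the thresholds follow the numbers above, and the assignment of yellow to "low" and red to "high" is an assumption consistent with the next paragraph:

```python
# Window-comparator logic for the LED indicator (illustrative sketch).
# Thresholds assume 230 V nominal scaled to 10.0 V DC, so +/-5% = 9.5 V / 10.5 V.
LOW_THRESHOLD = 9.5    # DC volts at 5% under nominal
HIGH_THRESHOLD = 10.5  # DC volts at 5% over nominal

def led_state(v_dc):
    """Return which LED to light for a given sense voltage."""
    if v_dc <= LOW_THRESHOLD:
        return "yellow"   # mains 5% or more below nominal
    if v_dc >= HIGH_THRESHOLD:
        return "red"      # mains 5% or more above nominal
    return "green"        # mains within 5% of nominal
```

In hardware this would be two comparators and a little gating; the point is that two comparison bits are enough to select between the three LEDs.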
A second problem is how to power the circuit. You know you have a step-down transformer, so you should be able to design a power supply. But wait: there's an issue here with the specifications. You're told to illuminate a yellow LED if the voltage is 5% or more below nominal, but it's going to be hard to do that at 0V input. You may have to make a reasonable assumption here, say that the circuit will keep working down to 30% under nominal. So your power supply has to work from as little as 160V in, and still provide (say) a regulated 15V for the circuit to run on.
That's smaller problem number two.
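A quick back-of-envelope check of that worst-case input makes the power-supply requirement concrete. All the component figures here (regulator dropout, bridge diode drops) are my own illustrative assumptions, not part of the assignment, and ripple is ignored:

```python
import math

# Assumed figures for a rough transformer/regulator sizing check.
NOMINAL_RMS = 230.0
V_MIN_RMS = NOMINAL_RMS * 0.70      # design floor: 30% under nominal, ~161 V
V_REG_OUT = 15.0                    # regulated rail the circuit needs
V_DROPOUT = 2.0                     # assumed linear-regulator dropout
V_RECT_DROP = 1.4                   # two diode drops in a bridge rectifier

# The rectified secondary peak must stay above what the regulator needs
# even at minimum mains (filter-capacitor ripple ignored for this estimate).
v_sec_peak_needed = V_REG_OUT + V_DROPOUT + V_RECT_DROP
v_sec_rms_needed = v_sec_peak_needed / math.sqrt(2)

# Transformer ratio that still meets this at 30% under nominal:
turns_ratio = V_MIN_RMS / v_sec_rms_needed   # roughly 12:1
```

In other words, picking the transformer for the worst-case low mains, rather than the nominal value, is what makes the supply survive the undervoltage you're supposed to be indicating.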
The third problem is how to convert the AC voltage into a DC voltage representing its RMS value. One approach is to use a rectifier or precision rectifier circuit and rectify and filter the output voltage of the transformer. It's easier to measure the average value of the rectified voltage, and assume the input is a sine wave, than to measure true RMS (this is where your AC analysis might come in: for a sine wave there is a constant factor between the two). This is really three even smaller problems: rectify the voltage, filter the voltage and (perhaps) scale the voltage so that it meets the requirement from the first problem of 230VAC -> 10.0V output.
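That "constant factor" is the sine wave's form factor, the ratio of RMS to rectified average, which comes out to about 1.11. A quick numeric check (variable names are mine):

```python
import math

# For a sine wave v(t) = Vp*sin(wt):
#   RMS value          = Vp / sqrt(2)
#   average of |v(t)|  = 2*Vp / pi
# Their ratio (the form factor) is independent of Vp:
form_factor = (1 / math.sqrt(2)) / (2 / math.pi)   # = pi / (2*sqrt(2)) ~ 1.1107

# So a circuit that outputs the rectified average can be scaled by ~1.11
# to read in RMS, valid only while the input really is sinusoidal.
```

This is exactly why the assumption matters: on a distorted waveform the 1.11 factor no longer holds and an averaging meter reads wrong.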
So, that's a total of five smaller design problems, and along the way we've detected a deficiency in the specifications. This assignment is fairly representative of what you'll run into in practice: it's in miniature, but all the elements are there.
One little enhancement I'll recommend: keep the current draw constant regardless of which LEDs are illuminated (especially in the LED circuit, since it will draw the most). If you can describe why that's a good thing, you may get bonus marks.
Circuit breakers are not enough to protect life. Circuit breakers are there to stop the cables in the walls of your house from melting and possibly catching fire; circuit breakers and fuses perform the function of preventing a fire (which, of course, is also very dangerous to life).
For direct contact with a live AC part, in the UK we have residual current devices (RCDs). These "trip" the supply if the current taken down one of the AC wires differs from the current down the other by ~20mA.

[image: RCD wiring diagram (source: diyhowto.co.uk)]
Clearly a fuse wouldn't be useful here, because the normal current of the devices attached to the AC supply will be tens of amps or more. So if you have an appliance taking 10 amps and you touched one of the AC conductors, you'd draw an earth current of maybe 20mA; this would "imbalance" the RCD and trip the supply.
As for touching both terminals simultaneously a different scenario has to be envisaged. I'm talking about AC power systems where one conductor (sometimes called neutral) is "earthy" i.e. it may have a voltage of only a couple of volts to earth – if you touched only this wire then it is very unlikely to trip the RCD BUT who cares – it's only a couple of volts put across your body at best and hardly any current will flow. If instead you touched both AC wires (live and neutral) then there will be an earth current taken from the live that is still significantly greater than the earth current from the neutral and the RCD trips.
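The trip condition in both scenarios above boils down to one comparison. A minimal sketch, using the answer's ~20mA figure (the function name and example currents are illustrative):

```python
TRIP_THRESHOLD_A = 0.020  # ~20 mA residual threshold, per the text above

def rcd_trips(i_live, i_neutral):
    """Trip when the live and neutral currents differ by more than the
    threshold; any imbalance means current is returning via another path
    (for example through a person to earth)."""
    return abs(i_live - i_neutral) > TRIP_THRESHOLD_A

# Appliance drawing 10 A while a person diverts 25 mA to earth from the live:
rcd_trips(10.0, 9.975)   # True: 25 mA imbalance exceeds the threshold
rcd_trips(10.0, 10.0)    # False: currents balanced, no earth leakage
rcd_trips(0.001, 0.0)    # False: touching the earthy neutral alone, no trip
```

Note the device never measures the current through the person directly; it only sees the difference between the two conductors, which is why touching live and neutral together still trips it (the live-to-earth component unbalances the pair).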
Having said all of this, ~20mA is still going to sting, even if it only flows for under 100 milliseconds. Will it be lethal? Possibly, to people with heart complaints, but will those folk be rummaging under a desk to blindly push a connector into a socket?
For AC systems that are "isolated" from earth, touching any one wire will barely be noticeable, but touching both will not trip an RCD and you'll be in serious danger: the current will flow directly through your body from conductor to conductor. Luckily these sorts of installations are not very common, but they're certainly not unheard of; losing the neutral-earth bond can cause this problem.