Yes, the dropout voltage is the headroom the regulator needs to work with. This is the minimum amount the input voltage must be above the output voltage for the regulator to work properly. So as you say, a 3.3 V regulator with 1.1 V dropout requires at least 4.4 V input.
Stuff happens, so it's usually a good idea to give the regulator a little more room. However, going too far will be less efficient, so there is a tradeoff. The range you quote of 4.75 to 5.25 volts could well be a reasonable tradeoff. Basically, that's a "5 V" supply.
The output current of a linear regulator is essentially its input current (minus the small current on the ground pin, but that is small enough to ignore for the point I'm trying to make). Since the input voltage is higher than the output but the currents are the same, the output has less power. The difference between input power and output power is wasted in the regulator as heat. Another way to look at this is that the regulator dissipates the difference between the input and output voltages times the current as heat. For example, consider a case where the input is 5.0 V, the output 3.3 V, and the current 500 mA. The voltage across the regulator is 5.0 V - 3.3 V = 1.7 V. That times 500 mA is 850 mW, which will go into heating the regulator. That would be fine for something like a TO-220 package, but a SOT-23, for example, would burn up at 850 mW.
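The arithmetic above can be sketched as a one-liner; this is just the (Vin - Vout) × I approximation from the text, still ignoring ground-pin current:

```python
# Rough linear-regulator dissipation estimate, ignoring ground-pin current.
def dissipation_w(v_in, v_out, i_out_a):
    """Power burned in the regulator as heat, in watts."""
    return (v_in - v_out) * i_out_a

# The example from the text: 5.0 V in, 3.3 V out, 500 mA load.
p = dissipation_w(5.0, 3.3, 0.5)
print(f"{p * 1000:.0f} mW")  # 850 mW
```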
The output voltage of the transformer will be proportional to the input voltage. If it is 15 VAC at 220 VAC in, then it will be 13.6 VAC at 200 VAC in, for example.
You seem to have a lot of headroom, so you should be able to tolerate significant line voltage sag. In fact, you seem to have so much that your supply is quite inefficient. At 15 V out, the peaks of the waves will be 21.2 V. Assuming a full wave diode bridge, you lose about 1.4 V, leaving 19.8 V at the peaks. This is the level the capacitor after the full wave bridge will be charged to twice per line cycle, which is every 10 ms for 50 Hz power.
The 7805 regulator needs about 7.5 V in to make reliable 5 V out. There is a lot of room between the 7.5 V minimum the regulator needs and the 19.8 V peaks the cap gets charged to. The cap voltage will droop between the peaks depending on the cap size and the current. Usually things are sized for a few volts of droop. For example, let's say you have a 1 mF cap and the current draw is 100 mA. That would only sag 1 V over 10 ms, which brings the lowest voltage into the 7805 down to 18.8 V.
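As a quick sanity check of those numbers, here is the same ripple estimate (ΔV = I·t/C, assuming a roughly constant load current between peaks) with the values from the text:

```python
import math

# Ripple sag on the filter cap between charging peaks, assuming a
# constant load current: dV = I * t / C.
def ripple_v(i_load_a, t_between_peaks_s, cap_f):
    return i_load_a * t_between_peaks_s / cap_f

# Numbers from the text: 15 VAC transformer, full wave bridge (about
# 1.4 V drop), 1 mF cap, 100 mA load, 50 Hz power (peaks every 10 ms).
peak_v = 15 * math.sqrt(2) - 1.4    # about 19.8 V after the bridge
sag = ripple_v(0.100, 0.010, 1e-3)  # 1 V of droop between peaks
print(f"lowest cap voltage: {peak_v - sag:.1f} V")  # about 18.8 V
```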
We can work this backwards and see what the minimum necessary line voltage is in this case. You need 7.5 V into the regulator minimum. The sag between peaks is still the same because it is only a function of the current draw and the capacitance. That means the cap needs to be charged to 8.5 V at the peaks, which requires 9.9 V AC peaks before the full wave bridge, which is 7 V RMS. That is 47% of the output at 220 V, so your minimum input limit is 103 V.
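The backwards calculation above can be written out step by step; the constants here (7.5 V regulator minimum, 1 V sag, 1.4 V bridge drop, 15 VAC at 220 VAC nominal) are the assumptions already made in the text:

```python
import math

# Working backwards to the minimum tolerable line voltage.
reg_min = 7.5      # minimum input the 7805 needs, V
sag = 1.0          # ripple droop; depends only on load current and cap
bridge_drop = 1.4  # two diode drops in the full wave bridge

cap_peak_min = reg_min + sag               # 8.5 V needed at the cap peaks
ac_peak_min = cap_peak_min + bridge_drop   # 9.9 V AC peaks before the bridge
ac_rms_min = ac_peak_min / math.sqrt(2)    # 7.0 V RMS out of the transformer
line_min = 220 * ac_rms_min / 15           # transformer output is proportional
print(f"minimum line voltage: {line_min:.0f} V")  # about 103 V
```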
Of course you should always leave some margin because stuff happens, but any vaguely reasonable "220 V power" line isn't going to sag as low as 103 V unless something is very wrong.
As for another solution, get a "universal input" power supply. These are designed to work with any of the worldwide house power standards. They are usually specified for something like 90-260 V AC, 50 or 60 Hz. They are switchers, so they are a lot more efficient than your linear regulator. This is really a better answer anyway than a big iron line transformer type power supply.
It is literally the difference between the input voltage and the output voltage. If the input is 9 V and the output is 5 V, then the differential is 4 V.
The reason the graph doesn't go all the way down to 0 V is the dropout voltage of the regulator.