Ignoring charging losses (and I'll get to that), the first statement is wrong and the second statement is correct.
Converter efficiency is measured in terms of power, not current. So, for instance, an ideal converter with a 12-volt input at 1 amp would give a 4-volt output at 3 amps: the power is unchanged, but the output current is three times the input current. That's why the 75% figure is correct.
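A quick sketch of that arithmetic (using the hypothetical 12 V → 4 V numbers above, and assuming an ideal, lossless converter):

```python
# Hypothetical example: 12 V input at 1 A, stepped down to 4 V output.
v_in, i_in = 12.0, 1.0
v_out = 4.0

p_in = v_in * i_in            # 12 W drawn at the input
# An ideal converter conserves power, so as the voltage steps down,
# the output current must rise proportionally.
i_out = p_in / v_out
print(i_out)                  # 3.0 A: three times the input current
```

Note that the current tripled but the power (12 W) did not change, which is the whole point of rating efficiency in power.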
With that said, even if you have a perfect converter, batteries do not charge with perfect efficiency. At the very least, they start getting hot towards the end of the charge cycle. Even when charged, a battery's capacity has to be measured by a discharge cycle, so charging efficiency is really the efficiency of a full charge/discharge cycle. Wikipedia suggests a charge efficiency of 80-90% for Li-ion batteries. The standard 16-hour @ 0.1C charge cycle for NiCds yields a maximum efficiency of 62%.
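The NiCd figure follows directly from the charge regime: a 16-hour charge at 0.1C pushes 1.6C of charge into a cell that can store at most 1C, so even before any other losses the efficiency is capped. A small sketch of that bound:

```python
# Why a standard 16 h @ 0.1 C NiCd charge caps efficiency at ~62%:
# 1.6 C of charge goes in to store, at best, 1 C.
capacity_c = 1.0                     # cell capacity, in units of C
charge_current = 0.1 * capacity_c    # 0.1 C charge rate
charge_hours = 16.0
charge_delivered = charge_current * charge_hours   # 1.6 C pushed in
max_efficiency = capacity_c / charge_delivered
print(max_efficiency)                # ~0.625, i.e. about 62%
```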
Firstly, the battery is not \$5\mathrm{V}\$; it is nominally \$3.7\mathrm{V}\$, as pointed out already.
However, the confusion lies in misinterpreting the \$\mathrm{mAh}\$ rating of the battery. It has nothing to do with converter losses, and everything to do with
$$P=I\times V$$
If the battery is rated at \$6\mathrm{Ah}\$, it means you can draw \$1\mathrm{A}\$ for \$6\mathrm{\space hours}\$. So let's say you are drawing \$1\mathrm{A}\$ from the battery. This means it is delivering:
$$P=I\times V = 1 \times 3.7 = 3.7\mathrm{W}$$
Now let's say you feed this through an ideal boost converter. In this case \$P_{in}=P_{out}\$. So assuming we boost it up to \$5\mathrm{V}\$, the current that must be drawn from the output to discharge the battery at this rate is:
$$I=P/V=3.7/5=0.74\mathrm{A}$$
So what this means is at the higher voltage, you can draw a current \$0.74\mathrm{A}\$ for \$6\mathrm{\space h}\$ for that battery capacity - so the output capacity is \$4.44\mathrm{Ah}\$.
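The calculation above can be run end to end (assuming an ideal converter, as in the answer):

```python
# The boost-converter arithmetic from the answer, assuming no losses.
v_batt = 3.7          # nominal Li-ion cell voltage
rated_ah = 6.0        # battery capacity rating
v_out = 5.0           # boosted output voltage

i_batt = 1.0                                # draw 1 A from the battery
p = i_batt * v_batt                         # 3.7 W delivered
i_out = p / v_out                           # current available at 5 V
effective_ah = rated_ah * v_batt / v_out    # capacity as seen at 5 V
print(i_out, effective_ah)                  # ~0.74 A, ~4.44 Ah
```

With a real converter, multiply by its efficiency (e.g. 90%) and the effective capacity drops further.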
Again, this is not converter losses; in fact, with losses the number would be even lower. Instead it reflects the fact that you are sacrificing current capacity for a voltage gain, which is how a boost converter works. If this weren't the case, you would have invented free energy.
Essentially, at a higher voltage the energy delivered by each unit of charge \$\left(\mathrm{V}=\mathrm{J}/\mathrm{C}\right)\$ is greater, so at a higher voltage but lower current you are still delivering the same energy.
Best Answer
There are two reasons why batteries are rated for capacity (A·h) instead of energy (W·h):

1. Cells using the same chemistry will have equal (or very close) voltage ratings, so voltage can be factored out when comparing their capacity.
2. In many chemistries, the voltage changes significantly over the charge-discharge cycle, and also with temperature and load. Factoring that variable voltage into the battery's energy rating would make many calculations complex.
So if you say "I have a battery of 1 W·h", you need to specify the conditions under which this energy will be consumed, while "a battery of 1 A·h" characterizes the battery itself, not the environment where it will be used.
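The second point can be illustrated with a small sketch (the discharge curves below are made-up numbers, not real cell data): the same amount of charge in A·h yields different energy in W·h depending on the voltage profile, which itself depends on load.

```python
# Hypothetical discharge curves: (fraction discharged, terminal voltage).
# Under a heavier load the voltage sags, so the same charge delivers less energy.
curve_light_load = [(0.0, 4.2), (0.5, 3.8), (1.0, 3.3)]
curve_heavy_load = [(0.0, 4.0), (0.5, 3.6), (1.0, 3.0)]

capacity_ah = 2.0   # same 2 Ah of charge in both cases

def energy_wh(curve, capacity):
    """Trapezoidal integration of V dQ over the discharge curve."""
    total = 0.0
    for (x0, v0), (x1, v1) in zip(curve, curve[1:]):
        dq = (x1 - x0) * capacity      # charge slice in Ah
        total += dq * (v0 + v1) / 2.0  # Wh for that slice
    return total

print(energy_wh(curve_light_load, capacity_ah))  # ~7.55 Wh
print(energy_wh(curve_heavy_load, capacity_ah))  # ~7.10 Wh
```

Same 2 A·h in both cases, but different W·h, which is why the A·h figure characterizes the cell while the W·h figure depends on how it is used.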