When charging over a USB port that is also used for communication, you're limited to 2.5W/500mA. When using a port dedicated to charging (which can be a female USB-A receptacle), you can draw almost 10W/2A. From these numbers alone, the dedicated charger is clearly better.
However, batteries have thermal and chemical properties which often limit the charge rate to something less than these values. This is especially true for small batteries in confined spaces like your cell phone battery, and the lithium-ion chemistry likely used adds the potential for explosive thermal failure, so the charge rate is limited by protection circuitry.
It would be reasonable to assume that the maximum charge rate for your cell phone battery is a little less than 2A and more than 500mA. Cell phone batteries are usually charged at about 1C, and the maximum cell voltage of a Li-ion is about 4.2V, so assuming a 1000mAh battery and a 90% efficient charger (which is generous), you'd need about 4.7W of input power to charge your battery as fast as possible.
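That back-of-envelope calculation can be written out; the capacity, charge rate, and efficiency figures below are the same assumptions as above, not measured values:

```python
# Rough estimate of charger input power for a ~1C charge.
# Assumptions (illustrative only): 1000 mAh cell, 4.2 V peak cell
# voltage, 90% charger efficiency.

CAPACITY_MAH = 1000.0      # assumed battery capacity
CHARGE_RATE_C = 1.0        # typical phone charge rate (~1C)
V_CELL_MAX = 4.2           # Li-ion maximum cell voltage
CHARGER_EFFICIENCY = 0.90  # generous assumption

charge_current_a = CAPACITY_MAH / 1000.0 * CHARGE_RATE_C  # 1.0 A
battery_power_w = charge_current_a * V_CELL_MAX           # 4.2 W at the cell
input_power_w = battery_power_w / CHARGER_EFFICIENCY      # ~4.67 W at the input

print(f"Battery-side power: {battery_power_w:.2f} W")
print(f"Charger input power: {input_power_w:.2f} W")
```

Note that the result sits comfortably between the 2.5W of a data port and the ~10W of a dedicated charging port, which is the point of the paragraph above.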
Charging more slowly than this (i.e. at 2.5W) shouldn't diminish the charge level of the battery in any noticeable way. A software power meter could be erroneously reporting a higher battery life due to heating of the battery and the consequent higher voltage, or any number of other errors. If an accurate test (identical use patterns, signal strength, previous charge levels, etc) indicates that your phone has markedly less battery life after charging by USB, then either your battery is defective or your phone circuitry is poorly designed.
I believe the answer can only be empirical, not definitive.
To examine some of the figures mentioned:
there is a 20% inefficiency (which I do not know if it is true for most portable chargers)
A portable charger that is itself charged from USB (5 Volts) would need a boost converter to be able to supply 5 volts at its output. Boost converters commonly mention efficiency of 65 to 85%. TI's TPS61030, TPS61031 and TPS61032 state 96%, and Maxim's MAX8815A states 97% efficiency.
These figures do not account for possible efficiency loss due to external components (ESR of capacitors for instance) or temperature variation. Thus, treat that "20%" number as indicative at best.
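To see what a given converter efficiency means in practice, here is a sketch converting a power bank's rated pack capacity (quoted at the 3.7 V nominal cell voltage) into deliverable capacity at the 5 V output. The pack size and efficiency values are assumptions for illustration:

```python
# Sketch: usable 5 V output capacity of a power bank, given its pack
# capacity (rated at ~3.7 V nominal) and a boost converter efficiency.
# All numbers are illustrative assumptions, not datasheet figures.

def output_mah_at_5v(pack_mah, v_nominal=3.7, v_out=5.0, efficiency=0.85):
    """Convert pack capacity to deliverable capacity at the output voltage."""
    energy_wh = pack_mah / 1000.0 * v_nominal  # stored energy in the pack
    out_wh = energy_wh * efficiency            # after conversion loss
    return out_wh / v_out * 1000.0             # back to mAh, now at 5 V

# Compare the "20% loss" figure with a best-case datasheet efficiency:
for eff in (0.80, 0.96):
    print(f"{eff:.0%} efficient: "
          f"{output_mah_at_5v(10000, efficiency=eff):.0f} mAh at 5 V")
```

Note that even at 100% efficiency the mAh number at 5 V is smaller than the pack's rating, because of the 3.7 V to 5 V conversion; that voltage factor is easy to overlook when comparing "20%" figures.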
your phone needs power for stand-by, so in my experience, you'll have just 65% capacity.
That would depend on whether the phone is kept powered on while charging, what power-intensive tasks (e.g. WiFi, social media polling software) are running on the phone, and even the current draw of the phone in the nominal "powered off" state. Some smartphones do not actually power off completely unless the battery is pulled out.
Thus, that 65% number is also indicative at best, though varying it somewhat is within the user's control.
industrial standards for batteries admit a +/- 20% tolerance on capacity.
That number would be defined in the datasheet of the specific battery in question. It would also vary widely by age / charge cycle history of the battery, temperature, contact oxidation and possibly several other factors.
So, while the number is a reasonable guesstimate, it is not definitive.
Note that this last figure is applicable to both the cellphone battery and the portable charger battery.
So, can one use the magical value 45% as a gauge for a portable battery charger?
Clearly not. The only numbers that can be used, even as a rule of thumb, are those empirically measured for your particular situation and use pattern. Even so, the percentage will change widely over charge cycles, season and time of day (temperature factors).
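Stacking the guesstimated factors above shows why a single "magic" percentage is misleading. This is a toy calculation: the 80%, 65%, and ±20% figures are the indicative values discussed above, and applying the tolerance to both batteries independently is a rough simplification:

```python
# Stack the guesstimated factors from the discussion above to get a
# plausible range rather than a single "magic" percentage.

CONVERSION = 0.80  # "20% inefficiency" figure, indicative at best
STANDBY = 0.65     # "65% capacity" figure, depends on use pattern
TOLERANCE = 0.20   # +/- 20% capacity tolerance, applies to BOTH batteries

nominal = CONVERSION * STANDBY  # 52% before tolerances
# Tolerance applies to the charger pack and the phone battery
# independently, so the worst/best cases compound:
low = nominal * (1 - TOLERANCE) ** 2
high = nominal * (1 + TOLERANCE) ** 2

print(f"Nominal: {nominal:.0%}, range: {low:.0%} to {high:.0%}")
```

The resulting spread (roughly 33% to 75% under these assumptions) brackets the 45% figure so loosely that only empirical measurement of your own setup is meaningful, which is the conclusion above.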
A 4.2V buck converter isn't a good way to charge a lithium-ion battery unless its output current is limited. There are three stages that a Li-ion battery needs to go through to be charged correctly, and getting any of them wrong can result in exploding batteries. See charging lithium ion batteries for more info on that.
Your question about charge entering and leaving a battery at the same time isn't really an issue. I think in that regard, your question is a dupe of another one here from a month or so ago that asked what happens when you charge and discharge a battery at the same time (I can't find it now...). At any rate, the point of that answer was that the battery will source whatever current it can, and any extra current the load needs will be sourced from the 4.2V buck.
If the battery is discharged below 4.2V, then the buck converter will supply current both to the battery (charging it) and to the boost converter. This assumes that you have connected the battery in parallel with the boost converter's input.
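The current-sharing idea above can be shown as a toy model. It deliberately ignores the impedance-dependent details of a real parallel connection and just assumes the battery supplies what it can while the buck makes up the remainder; all numbers are illustrative:

```python
# Toy model of the current split described above: battery and 4.2 V buck
# output in parallel feeding one load. The battery supplies what it can,
# the buck makes up the rest. Real hardware splits current by source
# impedance; this is only a sketch of the idea.

def current_split(load_a, battery_max_a, buck_max_a):
    """Return (battery_current, buck_current) for a shared load, in amps."""
    from_battery = min(load_a, battery_max_a)
    from_buck = load_a - from_battery
    if from_buck > buck_max_a:
        raise ValueError("combined sources cannot supply this load")
    return from_battery, from_buck

print(current_split(1.5, 1.0, 2.0))  # battery gives 1.0 A, buck adds 0.5 A
```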