I recently bought a 5V/1A USB power supply, and since the packaging was slightly damaged (as disclosed in the auction listing), I thought about testing it before connecting it to anything sensitive.
Since it has a USB-A receptacle and I have the matching connector in my parts supply, I was thinking of simply soldering leads to pins 1 and 4, connecting those through a resistor, and then connecting a multimeter in series with or in parallel to the resistor to measure output current and voltage. I'll be using 10W resistors to stay on the safe side (with a lowest resistance of 5 ohms, of course).
My question is: is there something inherent in the USB standard that will cause problems, or worse, hazards, with this test setup?
Also, a side-question: is it safe to base this type of setup on a typical breadboard and 22 AWG cables?
I don't believe USB power supplies require any negotiation to ask for current. When working with a PC, devices are supposed to draw no more than 100 mA and ask permission to draw more (and in practice they usually just take what they need up to 500 mA). I only mention that because you can't expect this same test to work with your computer.
The test you're considering should not damage the power supply. On a cautionary note, the 5W dissipated in a 10W-rated resistor (5V across 5 ohms draws 1A, so 5V × 1A = 5W) will generate plenty of heat over a small area, more than enough to cause skin burns and start fires. Left alone for several minutes, even in free air, it will easily exceed 100°C. There's a reason power resistors are usually made of ceramic materials. Take some precautions: work on a metal surface, wear gloves, use a small fan or attach a heat sink, and you should be fine.
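If you want to sanity-check the numbers for other resistor values before committing, it's just Ohm's law. Here's a minimal sketch (a hypothetical helper, assuming an ideal 5 V source; a real supply will sag slightly under load):

```python
def load_stats(v_supply: float, r_load: float) -> tuple[float, float]:
    """Return (current in A, power in W) for a resistor placed across the supply."""
    current = v_supply / r_load        # I = V / R
    power = v_supply ** 2 / r_load     # P = V^2 / R
    return current, power

# The 5-ohm worst case from the question:
i, p = load_stats(5.0, 5.0)
print(f"{i:.2f} A, {p:.2f} W")  # 1.00 A, 5.00 W -- half the resistor's 10 W rating
```

Anything below 5 ohms would push the load past the supply's 1 A rating, which is why that value is the sensible floor here.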
22 AWG wire can tolerate roughly 8-13A depending on the insulation, so it's fine at 1A. Another StackExchange question indicates that breadboards have about a 1A current limit, so I refer you to that question for alternatives. (The voltage on a breadboard does not matter as long as it's low.)