Is a 250V lamp compatible with both 120VAC and 220VAC?

Tags: frequency, lamp, mains, voltage

Is a 250V and 660W lamp compatible with both:

  • 120VAC at 60Hz, and
  • 220VAC at 50Hz?

And why? Also, does the frequency of the mains have any bearing on the lamp?

In North America I was very surprised to notice a lamp certified for 250V but plugged into a 120VAC mains. My naive intuition says the lamp should be certified for 120V. Similarly, the same 250V lamp would probably be compatible with a 220VAC mains (as in Europe), but I would have expected a lamp in Europe to be certified for 220V.

But I'm most likely wrong here, so:
Why is a 250V lamp compatible with both 120VAC and 220VAC mains?

PS To avoid confusion, by "lamp" I mean a light fixture and NOT a light bulb.

Best Answer

I see that you're referring to the light fixture itself. In this case, the rating is the maximum that the fixture construction can withstand. Remember that the fixture is really only wires, switches, and connectors. It only serves to pass the current from the wall to the light bulb.

In your case, the fixture can handle any voltage up to 250V. You can use a 120V or a 240V bulb. The bulb would have to match the mains voltage.

660W is the largest bulb the fixture has been designed for, so that nothing gets too hot and/or melts. You can use a smaller-wattage bulb.
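Putting those rules together: a bulb is compatible with this fixture if it matches the mains voltage, and if both the mains voltage and the bulb's wattage are within the fixture's ratings. A minimal sketch of that check (the 250V/660W figures are the ratings from the question; the function name and scenarios are illustrative):

```python
def bulb_fits(fixture_max_v, fixture_max_w, mains_v, bulb_v, bulb_w):
    # The bulb must match the mains voltage, and the mains voltage and
    # bulb wattage must both be within the fixture's ratings.
    return (bulb_v == mains_v
            and mains_v <= fixture_max_v
            and bulb_w <= fixture_max_w)

print(bulb_fits(250, 660, 120, 120, 100))  # True:  120V bulb on 120V mains
print(bulb_fits(250, 660, 240, 240, 660))  # True:  240V bulb on 240V mains
print(bulb_fits(250, 660, 120, 240, 100))  # False: bulb doesn't match mains
```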

And the frequency doesn't matter to the fixture, although the bulb you choose might be affected (but probably not...)


That was the simple answer, but there is another consideration. Thanks to @david for pointing it out.

The ratings on the fixture should include maximum voltage and maximum current. The maximum voltage is simply the highest voltage that the fixture is rated to withstand, without arcing or breaking down in some fashion.

The maximum current, if given, will be independent of voltage. Often, however, the manufacturer will use a power rating instead of a current rating (660W in this example). To do this, they have to make some assumptions about the incoming voltage. Those assumptions reflect the manufacturer's expectations of where the product will be used.

For example, a 660W bulb draws 5.5A at 120V, or 2.75A at 240V. Were they expecting a 120V bulb? Probably so, if the fixture was sold in an area that uses 120V mains. So, they chose wire that can handle 5.5A (or more). Since a 240V bulb draws less current, there is no problem.

But, say that you are using a 24V bulb (assuming you had a 24VAC power source). The fixture could certainly handle the voltage, and should work fine. However, if the bulb was 660W (the fixture's "maximum"), then the required current would be 27.5A. The wiring in the fixture will almost certainly fail at this current.
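All of the arithmetic above is just I = P/V. A quick sanity check of the three scenarios (the 660W figure is the fixture's rating from the question; the 24V case is the hypothetical one just described):

```python
def current_draw(power_w, voltage_v):
    # Current drawn by a load of the given power at the given voltage (I = P/V).
    return power_w / voltage_v

# A 660 W bulb at various supply voltages:
print(current_draw(660, 120))  # 5.5  A -> what the fixture wiring was sized for
print(current_draw(660, 240))  # 2.75 A -> lower current, no problem
print(current_draw(660, 24))   # 27.5 A -> far beyond what the wiring can handle
```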

So, if in doubt, check to see if there is a current rating on the fixture :)