I see that you're referring to the light fixture itself. In this case, the rating is the maximum that the fixture construction can withstand. Remember that the fixture is really only wires, switches, and connectors. It only serves to pass the current from the wall to the light bulb.
In your case, the fixture can handle any voltage up to 250V. You can use a 120V or a 240V bulb. The bulb would have to match the mains voltage.
660W is the largest bulb the fixture has been designed for, so that nothing gets too hot or melts. You can use a smaller-wattage bulb.
And the frequency doesn't matter to the fixture, although the bulb you choose might be affected (but probably not...)
That was the simple answer, but there is another consideration. Thanks to @david for pointing it out.
The ratings on the fixture should include maximum voltage and maximum current. The maximum voltage is simply the highest voltage that the fixture is rated to withstand, without arcing or breaking down in some fashion.
The maximum current, if given, will be independent of voltage. Often, however, the manufacturer will use a power rating instead of a current rating (660W in this example). To do this, they have to make some assumptions about the incoming voltage. Those assumptions reflect the manufacturer's expectations of where the product will be used.
For example, a 660W bulb draws 5.5A at 120V, or 2.75A at 240V. Were they expecting a 120V lamp? Probably so, if it was sold in an area that uses 120V. So, they chose wire that can handle 5.5A (or higher). Since this will also work with a 240V bulb, there is no problem.
But, say that you are using a 24V bulb (assuming you had a 24VAC power source). The fixture could certainly handle the voltage, and should work fine. However, if the bulb was 660W (the fixture's "maximum"), then the required current would be 27.5A. The wiring in the fixture will almost certainly fail at this current.
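To make that check concrete, here is a minimal Python sketch of the arithmetic. The 5.5A wiring limit is an assumption inferred from the 660W/120V discussion above, not a value from any real fixture; your fixture's limit is whatever its label says.

```python
# Sketch: does a given bulb stay within an assumed fixture rating?
# FIXTURE_MAX_CURRENT is an assumption (660 W / 120 V = 5.5 A) --
# check the label on your own fixture.

FIXTURE_MAX_VOLTAGE = 250.0  # volts, from the fixture marking
FIXTURE_MAX_CURRENT = 5.5    # amps, assumed from 660 W / 120 V

def bulb_is_safe(bulb_watts: float, supply_volts: float) -> bool:
    """True if the bulb's draw stays within both fixture limits."""
    if supply_volts > FIXTURE_MAX_VOLTAGE:
        return False
    current = bulb_watts / supply_volts  # I = P / V
    return current <= FIXTURE_MAX_CURRENT

print(bulb_is_safe(660, 120))  # True:  5.5 A, right at the limit
print(bulb_is_safe(660, 240))  # True:  2.75 A
print(bulb_is_safe(660, 24))   # False: 27.5 A -- wiring would fail
```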
So, if in doubt, check to see if there is a current rating on the fixture :)
Powering such a device from DC is perfectly fine; in fact the input is often rated for both AC and DC. You just need to be very careful about the input voltage: an AC input rating is given as an RMS value, but once the input is rectified the internal DC bus sits near the peak of the waveform, which is higher by that 1.414 factor, also known as \$\sqrt{2}\$, and that must be taken into account.
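As a quick worked example (the 230Vac figure is chosen purely for illustration):

\$\$V_{peak} = \sqrt{2} \cdot V_{RMS} \approx 1.414 \times 230\,V \approx 325\,V\$\$

so a DC source feeding such an input directly should be matched to the rectified (peak) voltage, not to the RMS rating printed on the label.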
So there is no risk for the user or the adapter, provided the voltage is right. If you exceed the maximum input voltage, some capacitors are going to explode (at least); if you use a lower input voltage, the adapter won't work properly, and it may either shut down on its undervoltage protection or draw extra current and heat up as it tries to compensate.
I'm guessing you want to power your laptop from your car battery: that is possible, but not by just hooking the laptop adapter to the battery.
Best Answer
If we examine the power draw at the power inlet of the device (and thus set aside one advantage of 220Vac over 110Vac), then an appliance drawing 5mA from 220Vac has a power draw, via \$P = IV\$, of \$220 \times 0.005 = 1.1\,W\$. [1]
Consider two types of loads:
A passive load follows Ohm's law, so a reduction in voltage results in a proportional reduction in current and, because \$P = V^2/R\$, a quadratic reduction in power. This is true for heaters, incandescent bulbs, etc.
If you were to take a 5kW electric heater from the EU and plug it into a US 110Vac outlet, you would find its output is about 1.25kW, because the resistive load draws roughly 11A instead of 22A.
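The numbers follow from Ohm's law, since the heating element's resistance is fixed:

\$\$R = \frac{V^2}{P} = \frac{220^2}{5000} \approx 9.7\,\Omega, \qquad P_{110V} = \frac{110^2}{9.7} \approx 1.25\,kW\$\$

Halving the voltage halves the current through a fixed resistance, so the delivered power drops to a quarter.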
Does this make 220V more or less efficient for a passive load? Neither: if you want a 5kW electric heater, you need to buy one rated for your operating voltage.
Active loads, however, behave more like constant-power loads: a reduction in input voltage is met with an increase in current draw as the device tries to maintain its operating point.
Take your example of 220Vac @ 5mA = 1.1W. An active load draws whatever current is required to satisfy its needs, so if the voltage were reduced to 110Vac, the current drawn would rise to 10mA to meet the power needs of the active circuit ([1] still applies here).
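A minimal sketch of the two behaviours, using the numbers from above (both functions are idealizations for illustration, not models of real appliances):

```python
# Compare how passive and active loads respond to a voltage change.

def passive_current(rated_watts: float, rated_volts: float,
                    supply_volts: float) -> float:
    """Resistive load: R is fixed, so I scales with V (Ohm's law)."""
    resistance = rated_volts ** 2 / rated_watts  # R = V^2 / P
    return supply_volts / resistance

def active_current(load_watts: float, supply_volts: float) -> float:
    """Constant-power load: I rises as V falls to hold P constant."""
    return load_watts / supply_volts

print(passive_current(5000, 220, 110))  # ~11.4 A -> only ~1.25 kW out
print(active_current(1.1, 220))         # 0.005 A = 5 mA
print(active_current(1.1, 110))         # 0.010 A = 10 mA
```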
Does this make 220Vac or 110Vac more efficient? This is where copper losses come into play, and they make 110Vac systems less efficient than 220Vac systems.
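The reason is that the loss in the supply wiring grows with the square of the current: delivering the same power at half the voltage doubles the current, and

\$\$P_{loss} = I^2 R_{wire}, \qquad (2I)^2 R_{wire} = 4\,I^2 R_{wire}\$\$

so the same wire dissipates four times as much.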
[1] This assumes the 5mA is drawn at unity displacement power factor with a pure sine wave. In practice it isn't, but the assumption is fine for quick calculations to illustrate the concept.