Yes, they're the same thing, but an electricity bill stating n × 3.6 megajoules is a bit abstract compared to watt-hours when light bulbs are rated in watts.
The issue you're confused about seems to be the difference between power and energy.
Energy is how much work you can do. Common units are joules or watt-hours.
Power is how fast you do work. It's a rate of change. Common units are watts or horsepower. Horsepower is probably an instructive unit to consider. Say you wanted to move a large pile of straw. Whether it's moved by a horse or a housecat doesn't affect the amount of work done. But the horse does it faster, because it's a more powerful animal.
For the purposes of discussing grid electricity consumption, watts (W) and kilowatt-hours (kWh) are the most common units used. To know how much energy is consumed, multiply the power by the time: 100 W × 1 hour is 100 watt-hours, or 0.1 kWh. In short, time is what connects power to energy consumed.
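That multiplication can be sketched in a couple of lines of Python (the 100 W bulb and one-hour runtime are just the numbers from the example above):

```python
power_w = 100.0     # power drawn by the bulb, in watts
hours = 1.0         # how long it runs

energy_wh = power_w * hours        # 100 watt-hours
energy_kwh = energy_wh / 1000.0    # 0.1 kWh, the unit on your bill
```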
A 100W bulb consumes 100W assuming the voltage across it is what's specified on the package, which is usually 120V in my experience. If the voltage at your socket is lower, the bulb will consume less power. It's approximately a fixed resistance, so the power consumed is
$$
P=\frac{V^2}{R}
$$
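As a rough sketch (assuming the filament really is a fixed resistance, which is only approximately true in practice), the rating on the package lets you back out R and then predict the power drawn at a lower socket voltage:

```python
rated_power_w = 100.0
rated_voltage_v = 120.0

# Infer the (approximately) fixed resistance from the rating: R = V^2 / P
resistance_ohm = rated_voltage_v**2 / rated_power_w   # 144 ohms

# At a sagging 110 V socket, the same resistance draws less power:
actual_power_w = 110.0**2 / resistance_ohm            # about 84 W
```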
As an aside, remember energy conservation. If something consumes 100W, that energy is being converted to some other form. Either it gets stored (potential energy), it's used (light, motion, etc.), or it's wasted as heat. For an incandescent bulb, ~90% of the power consumed is converted to heat. So a 100W incandescent bulb consumes 100W, but only outputs 10W of light. It gets hot because the other 90W is being wasted. That's why CFLs run so much cooler and consume less power for the same light output.
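Using the rough 90/10 split described above:

```python
input_w = 100.0
light_fraction = 0.10        # ~10% of input becomes light (rough figure from the text)

light_w = input_w * light_fraction   # 10 W of useful light output
heat_w = input_w - light_w           # 90 W wasted as heat
```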
Best Answer
Anyone who has a clue about how physical units work will of course realize that
kWh/1000h
means "1000 watt-hours per 1000 hours", which can be shortened to just "W".
But when it comes to lamps, the unit "W" is already used for the light output. Light bulbs that use more energy-efficient technologies than the classical incandescent bulb often state their light output as an equivalence to an incandescent bulb with a specific power consumption. Until 2010 you could often find LED light bulbs labeled as "equivalent to a 40W bulb". So a consumer who wants to replace an old 40W incandescent bulb with an equally bright LED bulb knows to look for a 40W-equivalent LED bulb. A consumer buying an LED lamp with an actual input power of 40W might be surprised by how bright it is.
Also, the average consumer doesn't know much about how electricity works. They know they need to pay for their electricity consumption in a unit called "kWh", so they want to know how much they need to pay when they run the device for x hours.
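That calculation is just power × time × price. A sketch with made-up numbers (the device, usage pattern, and the $0.15/kWh rate are assumptions; plug in your own tariff):

```python
power_kw = 60.0 / 1000.0   # a 60 W device, expressed in kilowatts
hours = 5 * 30             # 5 hours a day for 30 days
price_per_kwh = 0.15       # assumed electricity price, $/kWh

energy_kwh = power_kw * hours        # 9.0 kWh for the month
cost = energy_kwh * price_per_kwh    # $1.35
```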
So from the point of view of the average consumer, the unit "Watt" means "light intensity" and "kWh per hour" means "energy consumption". A physicist will of course interject that the unit for visible light radiated by a source is the lumen, and that the watt is the unit power consumption should be measured in, so that's what should be printed on light bulb boxes. But physicists aren't average consumers.
Using different units for each - even if both of them are misleading from a physicist's point of view - is the least misleading way to communicate this to the end-user.