What’s the frequency vs. power relationship of an oscillator + transmitter?

frequency oscillator power RF

I'm starting with the assumption that a higher frequency EM wave is more energetic than a lower frequency one, and therefore requires more energy (and thus more power) to transmit.

In my naive model of a transmitter, there's an oscillator circuit, an amplifier, and an antenna. It doesn't seem to be the case that there's a proportional relationship between frequency and power in an oscillator circuit, so I am assuming that the extra power must be consumed by either the amplifier or antenna circuitry.

Is the power consumed by an oscillator circuit not proportional to the frequency of oscillation, ceteris paribus?

In what stage of the generation/transmission of higher frequency EM waves does the extra power get consumed?

Assuming wide enough bandwidth in the amplifier + antenna for the frequencies under consideration (i.e. the 3 dB point is far enough out, or an ideal system with no parasitics), where does the extra power get consumed, to satisfy conservation of energy and the Planck–Einstein relation (E = hν)?

Best Answer

This is an engineering forum, not a physics forum, so I will try to give a simple answer.

A radio (electrical) oscillator works like a fast switch, opening and closing a circuit and thus producing a series of pulses (very simplified). It does not excite atoms to change their energy levels and then release photons whose energy is related to their frequency. So in a radio oscillator it is not necessary to add power as the frequency increases. A few meters from an ordinary RF antenna you will find EM radiation, i.e. photons, with very low energy (not intensity!), many orders of magnitude lower than that of a UV photon.
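To put numbers on that comparison, here is a short sketch using the Planck relation E = hf; the 10 GHz carrier and 300 nm UV wavelength are assumed example values, not taken from the question:

```python
# Compare the energy of a single RF photon with a single UV photon via E = h*f.
# Example values (assumed): a 10 GHz microwave carrier and 300 nm near-UV light.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electronvolt

f_rf = 10e9             # 10 GHz RF carrier (assumed example)
f_uv = c / 300e-9       # ~1e15 Hz for 300 nm UV

E_rf = h * f_rf / eV    # ~4.1e-5 eV per RF photon
E_uv = h * f_uv / eV    # ~4.1 eV per UV photon

print(f"RF photon: {E_rf:.2e} eV, UV photon: {E_uv:.2f} eV, "
      f"ratio: {E_uv / E_rf:.0f}")
```

For these example frequencies the UV photon carries roughly 10^5 times the energy of the RF photon, yet the transmitter simply emits vastly more (low-energy) photons per second to carry its power.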

But in, let's say, an X-ray tube, yes, you have to increase the voltage (even DC) to accelerate electrons and obtain high-energy photons from specific metal targets (e.g. a 100 kV tube yields photons of up to 100 keV).
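As a worked example of that mechanism: an electron accelerated through a potential V gains energy eV, which caps the bremsstrahlung photon energy (the Duane–Hunt limit). The 100 kV figure below is the same assumed example as above:

```python
# Energy gained by an electron accelerated through an X-ray tube voltage,
# and the shortest (highest-energy) photon wavelength it can produce
# (the Duane-Hunt limit). The 100 kV tube voltage is an assumed example.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
e = 1.602176634e-19     # elementary charge, C

V = 100e3               # tube voltage: 100 kV
E_joules = e * V        # electron kinetic energy at the target, J
E_keV = V / 1e3         # same energy in keV: 100 keV (not 100 eV)

lambda_min = h * c / E_joules   # shortest bremsstrahlung wavelength, m
print(f"Electron energy: {E_keV:.0f} keV, "
      f"lambda_min: {lambda_min * 1e12:.1f} pm")
```

So the photon energy scales directly with the accelerating voltage here, which is exactly the behaviour the oscillator does not exhibit.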

It is clearly a different mechanism.

In any case, don’t mix up classical electrodynamics and quantum mechanics.