The average value of the output of a buck converter in discontinuous mode is still the same as in continuous mode. However, the ripple voltage increases.
This happens at light loads, when the switcher cannot produce pulses of a low enough duty cycle and so transfers slightly too much energy per switching cycle for the load.
This means the average output voltage starts to rise, and the control system shuts the switcher down for several cycles, resulting in a slightly lower-than-normal output voltage. This situation persists until the load starts to draw more power.
Discontinuous mode also happens when the input voltage to the buck rises too high.
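As a rough sketch of where that light-load boundary sits, here is the standard textbook criterion for the CCM/DCM boundary of an ideal buck (average load current equal to half the inductor ripple current). The 12V/5V/10uH/100kHz numbers match the worked example later in this answer; the function name is just for illustration.

```python
# Sketch: critical (boundary) load current for an ideal buck converter.
# Below this average load current the inductor current reaches zero each
# cycle and the converter enters discontinuous mode.
# Standard formula: I_boundary = Vout * (1 - D) / (2 * L * f_sw),
# with D = Vout / Vin for an ideal converter in CCM.

def boundary_current(vin, vout, L, f_sw):
    """Average load current (A) at the CCM/DCM boundary."""
    duty = vout / vin                          # ideal CCM duty cycle
    return vout * (1 - duty) / (2 * L * f_sw)

# Example with the numbers used later in this answer:
i_b = boundary_current(vin=12.0, vout=5.0, L=10e-6, f_sw=100e3)
print(f"DCM below about {i_b:.2f} A of load current")
```

Note how raising Vin lowers D and so raises the boundary current, which is why a higher input voltage can also push the converter into discontinuous mode.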
EDIT - more information:
Suppose the buck converter is operating at 100kHz and the inductor is 10uH. Let's also say that the supply is 12V and the duty cycle is 50%. The "on" pulse of the MOSFET charges the inductor with current, and this current is determined by the input voltage, output voltage and inductance. Let's say the output voltage is 5V - this means the voltage across the inductor is 7V when the MOSFET is "on".
\$V = L\dfrac{di}{dt} \therefore 7 = 10\times 10^{-6} \times\dfrac{di}{dt} \therefore \dfrac{di}{dt} = 700{,}000\$ A/s
Because dt is 50% of the 10us period, we can calculate the peak current, which is \$700000\times 5\times 10^{-6}\$ = 3.5A.
The energy transferred per cycle is therefore \$\dfrac{L\times 3.5^2}{2}\$ = 61.25uJ.
This transfers 100k times per second \$\therefore\$ the power to the load is \$61.25\mu J \times 100000\$ = 6.125W.
If the load resistance is too high to take that power at 5V, then the output voltage will rise and the buck converter will enter discontinuous mode unless the control loop reduces the duty cycle.
This duty-cycle reduction will of course happen - it would be a poor buck converter that couldn't produce less than 50% duty - but the duty cycle has a minimum value, and at that point, if the energy-load equation isn't in equilibrium, the converter will enter discontinuous mode.
Periodic steady state does mean normal operation (which obviously isn't a constant output voltage). The idea is that the dynamics of the converter have settled and all startup transients have died away.
The net change in the capacitor voltage over a switching cycle in a buck converter must be zero, or else the output voltage would be moving around, which means the circuit wouldn't be working as a DC-DC converter. So yes, I would say it is because the output DC voltage settles to a constant (with some ripple).
First, "DC component" is an exact synonym for "signal average". The latter term is the time-domain description: you integrate the signal over the interval and divide by the interval. The former is the frequency-domain description: the "DC component" is the zero-frequency term of the Fourier decomposition, and its coefficient is that same integral of the signal divided by the interval.
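To see the equivalence numerically, here is a small sketch using NumPy: the k = 0 bin of `numpy.fft.fft` is the raw sum of the samples, so dividing it by the number of samples gives exactly the time-domain mean (the signal values here are arbitrary illustrative numbers):

```python
# Sketch: the signal average and the zero-frequency (DC) Fourier
# component are the same number.
import numpy as np

t = np.linspace(0, 1, 1000, endpoint=False)
x = 2.5 + 0.8 * np.sin(2 * np.pi * 7 * t)    # 2.5 "V" of DC plus ripple

avg = x.mean()                        # time-domain average
dc = np.fft.fft(x)[0].real / len(x)   # frequency-domain DC bin, normalized

print(avg, dc)   # both ~2.5
```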
The only place that the charge from the inductor can go is into the output capacitor or the load. If the average current in the inductor doesn't match the average current into the load, then the excess goes into (or comes out of) the capacitor.
Capacitors integrate current into voltage -- so if the net current of the capacitor isn't zero then its voltage will change. That'll change the current from the inductor and change the current into the load.
Let the inductor current be \$i_L\$, and the output current be \$i_O\$. Then, by Kirchhoff's current law, the capacitor current has to be \$i_C = i_L - i_O\$.
If the average capacitor current is anything but zero, then the capacitor voltage will continually climb or decrease. Since the capacitor current is the difference between the inductor current and the output current, if the average capacitor current is zero, then the average inductor current must equal the average output current.
So, just by the fact that the output voltage is steady and there's a cap there, the inductor average current has to match the load current.
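A toy numerical integration makes the argument concrete (a sketch only; the capacitance, ripple shape and step size are arbitrary illustrative choices, not taken from any particular converter):

```python
# Sketch: the capacitor integrates i_C = i_L - i_O into voltage.
# If the averages of i_L and i_O match, the voltage only ripples;
# if they don't, the voltage drifts.
import math

C = 100e-6      # farads (illustrative value)
dt = 1e-7       # integration time step (s)

def final_voltage(i_L_avg, i_O=1.0, steps=2000):
    """Integrate v += (i_L - i_O) * dt / C from a 5 V starting point."""
    v = 5.0
    for n in range(steps):
        # inductor current rippling around its average
        i_L = i_L_avg + 0.5 * math.sin(2 * math.pi * n / 100)
        v += (i_L - i_O) * dt / C
    return v

print(final_voltage(1.0))   # averages match: stays at 5 V (ripple cancels)
print(final_voltage(1.2))   # 0.2 A average excess: voltage climbs
```

With matched averages the ripple integrates to zero over whole cycles, so the voltage returns to where it started; any average mismatch accumulates as a steady drift.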