Electronic – What prevents abrupt voltage change in a circuit with a capacitor

capacitor, voltage

Let's say we have a simple circuit consisting of a power supply and a resistor, and currently the input voltage is 0V. We now apply a voltage of 5V to the circuit (like a step increase – instantaneously). The voltage across the resistor changes instantaneously to 5V.

If a capacitor is introduced into this circuit, it will gradually charge until the voltage across it is also approximately 5V, and the current in this circuit will become zero.


My question:
What is now preventing us from suddenly changing the voltage from 5V to let's say 10V (again like a step increase – instantaneously)? We could do it before the capacitor was introduced, but why not now?


The answer I have thus far always gotten is that, according to \$i = C\frac{dv}{dt}\$, an instantaneous voltage change would require the current flowing in the circuit to be infinite, and since that cannot happen, the voltage cannot be changed instantaneously.
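One way to see what \$i = C\frac{dv}{dt}\$ implies is to treat the "instantaneous" step as a very fast linear ramp and watch the current grow as the ramp time shrinks. A minimal sketch (the capacitance and step size here are assumed illustrative values, not from the question):

```python
C = 1e-6   # assumed 1 uF capacitor
dV = 5.0   # step from 0 V to 5 V

# For a linear ramp of dV over time dt, i = C * dv/dt = C * dV / dt.
# As dt -> 0 (a true step), the required current grows without bound.
for dt in [1e-3, 1e-6, 1e-9]:
    i = C * dV / dt
    print(f"ramp over {dt:.0e} s -> i = {i:.3g} A")
```

This is the sense in which a true step would demand infinite current: every finite ramp gives a finite current, but there is no finite limit as the ramp time goes to zero.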

The problem is that we ARE changing the voltage of the power source instantaneously, just like before the capacitor was introduced. The introduction of the capacitor has not somehow taken away our ability to change the voltage of the power source (instantaneously), has it???

So, in this circuit with the capacitor included:

  1. Can we change the input voltage instantaneously or not? (theoretically)
  2. If we can, what will happen to the current? Will it try to become infinite, as \$i = C\frac{dv}{dt}\$ states? Or is there something else that I'm missing?
  3. If we can't, is the capacitor somehow magically preventing this change? How can any component actually affect the input voltage of the circuit?

Best Answer

It's like the "paradox" of the immovable object meeting the irresistible force. In reality, neither can exist.

A real power source will have some impedance. A real capacitor will have some impedance. Real wires connecting them have resistance and inductance.

So in reality, when you slap a fairly 'stiff' power source across a fairly good real capacitor, there's a spark and the capacitor charges through those series resistances, with some ringing due to the inductances. Ignoring the inductances, the voltage difference would simply divide across the power supply's internal resistance, the capacitor's internal resistance, and the wire resistance, in proportion to each.

To answer your specific question: if the capacitor and voltage source and wires are ideal, you have a mathematical problem, like division by zero. It's of no consequence in the real world; it just illustrates that the ideal models of the power source, the capacitor, and the wires are insufficiently accurate to describe their real-world behavior. Modelling any one of those as a real part with real resistance (and inductance) will make the mathematical problem go away, though it still won't necessarily give you an accurate picture of what is actually happening.

For example, if the wire (or the capacitor or the power supply) had 10m\$\Omega\$ resistance, and there is a 10V difference, you could predict you'd see 1000A (which is very high, but not infinite) and the capacitor would charge very quickly. In reality that isn't likely to happen because of other non-ideal factors. If the 10m\$\Omega\$ were modeled as being in the power supply, the power supply voltage would drop. If it were modeled as being in the capacitor, the voltage would suddenly appear across the capacitor terminals. If it were modeled as being in the wire, the voltage would appear across the wire. But none of those is very realistic.
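As a rough sketch of those numbers (the 10m\$\Omega\$ and 10V come from the example above; the capacitance is an assumed value):

```python
R = 10e-3   # 10 mOhm total series resistance (from the example above)
dV = 10.0   # 10 V step
C = 1e-6    # assumed 1 uF capacitance, not specified in the answer

i_peak = dV / R   # initial current: finite, limited by R, not infinite
tau = R * C       # time constant of the exponential charge

print(i_peak)   # 1000 A peak
print(tau)      # 1e-08 s, i.e. 10 ns: the capacitor charges very quickly
```

With any nonzero series resistance, the current at the instant of the step is simply \$\Delta V / R\$, and the capacitor voltage then relaxes exponentially with time constant \$RC\$.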

If you modeled it as a circuit with no resistance at all and a tiny bit of inductance (even superconducting wires of any length have inductance), then a simple mathematical model would predict ringing that persists forever, with energy sloshing back and forth between the inductance and the capacitance at an angular frequency of \$\omega_0 = {1 \over \sqrt{LC}}\$.
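For concreteness, here is that ring frequency worked out with assumed (hypothetical) values of stray inductance and capacitance:

```python
import math

L = 1e-9   # assumed 1 nH of stray wire inductance
C = 1e-6   # assumed 1 uF capacitor

w0 = 1.0 / math.sqrt(L * C)   # omega_0 = 1/sqrt(LC), in rad/s
f0 = w0 / (2.0 * math.pi)     # ordinary ring frequency in Hz

print(w0)   # about 3.16e7 rad/s
print(f0)   # about 5 MHz
```

Even a nanohenry of wiring inductance turns the "instantaneous" step into a megahertz-range oscillation (damped, in reality, by whatever resistance is actually present).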