# Discharge and charge rates of a capacitor – Comparing energy movements

Tags: capacitor, charge, current

This is a really simple question, but I'm seeing some confusing behavior in an experiment and I'm trying to work out whether my understanding of the theory is wrong.

Wikipedia says:

> The coulomb (unit symbol: C) is the SI derived unit of electric charge (symbol: Q or q). It is defined as the charge transported by a steady current of one ampere in one second.

Ok so 1 amp for one second transports one coulomb, right?

So in other words, if I discharge a capacitor for one second at 1 amp and charge a second capacitor for one second at 2 amps from another power source, the net outcome is that I put twice as much charge on the second capacitor as I removed from the first capacitor, right?
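As a quick numeric check (a sketch assuming ideal constant-current sources, so the current really is steady), the charge transferred is simply $Q = I \cdot t$:

```python
# Charge transferred by a steady current: Q = I * t (coulombs)
def charge_transferred(current_amps: float, time_seconds: float) -> float:
    return current_amps * time_seconds

q1 = charge_transferred(1.0, 1.0)  # removed from the first capacitor
q2 = charge_transferred(2.0, 1.0)  # delivered to the second capacitor

print(q1, q2)       # 1.0 C vs 2.0 C
print(q2 / q1)      # the second capacitor received twice the charge
```

Note that real charging circuits rarely hold the current constant (an RC charge curve is exponential), so this only holds exactly when the current is actively regulated.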

If that is correct, then let me also ask something else: The voltage is irrelevant to the measured energy expenditure right? It doesn't matter if the first capacitor starts at 100 volts and discharges to 0 volts while the second capacitor starts at 0 and charges to 10 volts. All that matters for computing the energy movement is the actual charge/amps. Right?

Also, is there any kind of known condition where this doesn't hold true or something that easily trips people up?

> If that is correct, then let me also ask something else: The voltage is irrelevant to the measured energy expenditure right? It doesn't matter if the first capacitor starts at 100 volts and discharges to 0 volts while the second capacitor starts at 0 and charges to 10 volts. All that matters for computing the energy movement is the actual charge/amps. Right?

This part is not right. The voltage is very relevant to the energy.

The energy stored in a capacitor is:

$$E = \frac{1}{2}CV^2$$

The voltage on a capacitor is determined by the charge and the capacitance:

$$V = \frac{Q}{C}$$

If you have a smaller capacitor, a given amount of charge produces a higher voltage, and it takes correspondingly more energy to put that charge there.
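To make this concrete: substituting $V = Q/C$ into $E = \frac{1}{2}CV^2$ gives $E = \frac{Q^2}{2C}$, so the same charge stores very different amounts of energy in different capacitors. A small sketch with made-up component values:

```python
# Energy stored by charge Q on capacitance C: E = Q^2 / (2*C)
def stored_energy(q_coulombs: float, c_farads: float) -> float:
    return q_coulombs**2 / (2 * c_farads)

q = 1.0  # one coulomb placed on each capacitor

# Same charge, different capacitance:
print(stored_energy(q, 1.0))    # 1 F   -> 0.5 J  (V = Q/C = 1 V)
print(stored_energy(q, 0.01))   # 10 mF -> 50 J   (V = Q/C = 100 V)
```

The 10 mF capacitor holds 100 times the energy for the same one coulomb, which is exactly why equal charge does not mean equal energy.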