Electronics – what happens to the diode voltage drop when connected directly to a constant-voltage supply

Tags: diodes, voltage

Diodes usually have a voltage drop of about 0.7V or so, which is often said to be roughly constant for a given type of diode (i.e. not depending on voltage/current).

So what would happen if one connected a diode directly across a power source (for example, across a 1.5V battery), oriented so that it conducts?

On the one hand, one would expect to measure 0.7V across the diode, as that is its voltage drop. On the other hand, those same points are also directly across the power source, so one would expect to measure the source's 1.5V. And due to KVL (with no other elements in the circuit), the two cannot differ; they must be the same.

I'm primarily interested in what would happen in the real world and why (although I'd appreciate ideal-diode theory and/or simulations in addition to that).

What would a voltmeter across the diode read (in the short time before the diode self-destructs due to excessive heat)? And where would the missing 0.8V (1.5V − 0.7V) actually be? It must be somewhere, according to KVL.

Best Answer

What would happen is that the diode would self-destruct in a (possibly) spectacular fashion.

Every real-world battery has some internal resistance, as do real-world diodes. This resistance, together with the diode-drop, would determine the current flow.

Since a very large current would flow for most common batteries, the diode would be unable to dissipate the energy, and would overheat and fail.


Taking a common example: a 1N4001 diode connected directly across an alkaline "AA" battery:

  • The battery's internal resistance is ~200 mΩ.
  • The contribution of the diode's internal ohmic resistance is negligible.

Therefore, the current flow can be solved for fairly trivially, particularly if you ignore the fact that the diode drop varies depending on current.

The simple solution is \$\frac{1.5V - 0.7V}{0.2Ω} = 4A\$ (fig 1), so approximately 4 amps of current would flow.

With 4A current flow, and the 0.7V diode drop, the diode would be dissipating \$4*0.7 = 2.8W\$ (fig 2).

We can then look at the diode's thermal resistance (\$R_{θJA}\$), which is specified as 100 K/W. This means that for every watt dissipated, the diode's temperature will increase by 100 K (kelvin).

Therefore, with a 20°C ambient temperature, the diode's junction temperature will be \$20°C + 100\,K/W \times 2.8W = 300°C\$ (fig 3). 300°C is well past the diode's maximum rated junction temperature, and it will promptly self-destruct.
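The three-step estimate (figs 1–3) can be sketched as a quick back-of-envelope calculation. The component values are the assumed round numbers used above, not measurements:

```python
# Assumptions from the worked example: 1.5 V battery with 0.2 ohm internal
# resistance, fixed 0.7 V diode drop, 100 K/W junction-to-ambient thermal
# resistance, 20 degC ambient temperature.

V_BATT = 1.5       # battery EMF, volts
R_INT = 0.2        # battery internal resistance, ohms
V_F = 0.7          # assumed constant diode forward drop, volts
R_THETA_JA = 100.0 # junction-to-ambient thermal resistance, K/W
T_AMBIENT = 20.0   # ambient temperature, degC

current = (V_BATT - V_F) / R_INT             # fig 1: ~4 A
power = current * V_F                        # fig 2: ~2.8 W dissipated in the diode
t_junction = T_AMBIENT + R_THETA_JA * power  # fig 3: ~300 degC

print(f"I = {current:.1f} A, P = {power:.1f} W, Tj = {t_junction:.0f} °C")
```

Note that the remaining \$4A \times 0.8V = 3.2W\$ is dissipated inside the battery's internal resistance, which is where the "missing" 0.8V of the question ends up.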


Edit:

Basically, the critical thing here is there are no perfect voltage sources. If you connect a diode across a perfect voltage source, you will get the voltage of that perfect voltage source across the diode, for the infinitesimal period of time before the diode self-destructs due to self-heating.

However, all real-world voltage sources (such as a battery) have an internal resistance. It's that internal resistance, together with the resistance of the wires leading to the diode, the internal ohmic resistance of the components within the diode, and the actual diode-drop itself, that must be considered when trying to determine the instantaneous voltage across the diode the instant it's connected (well, that's ignoring cable and battery inductance, but that's another matter).


Further Edit:

I'm using \$0.7V_{F}\$ as a simplification of the real diode forward voltage, for the sake of ease of calculation (and because I'm too lazy to work out all the math). In a real situation, the diode drop rises with current, so the actual forward current of the diode will be somewhat lower than the 4A estimated above.

If you want to know the forward voltage to a higher degree of precision (and if so, why?), you can instead replace the fixed 0.7V forward voltage drop with the Shockley diode equation, and solve for the voltage drop of that in series with the internal resistance of the battery.

The datasheet has a graph of forward voltage versus \$I_{F}\$, which you have already found.
