Electronic – We can ignore the wire resistance if the current delivered by the wire is low

circuit-analysis, resistance, wire

In this question about the resistance of wire, the answer by user 比尔盖子 provided the following explanation of when resistance can be ignored:

Most of the time, the resistance of a wire is very low when you compare it to the resistance of other components and loads, so it's negligible and often safe to ignore. Moreover, by \$V = IR\$, the lower the current a load needs to take, the higher its equivalent resistance, so you can also ignore the wire resistance if the current delivered by the wire is low, because it's equivalent to connecting a small resistor (a wire) to a large resistor (a device that takes current) – almost no effect.

The first idea is clear to me: most of the time, the resistance of a wire is very low compared to the resistance of other components and loads, so it's negligible and often safe to ignore. However, the second idea is unclear to me – specifically, the claim that, because \$V = IR\$, the lower the current a load needs to take, the higher its equivalent resistance, and so we can ignore the wire resistance if the current delivered by the wire is low, because it's equivalent to connecting a small resistor (a wire) to a large resistor (a device that takes current) – almost no effect. If \$V = IR\$, then, \$V\$ being constant, a lower value of \$I\$ implies a larger value of \$R\$ – a larger resistance. But I don't understand why this implies that we can ignore the wire resistance when the current delivered by the wire is low.

I would appreciate it if people would please take the time to clarify this.

\$V=IR\$ isn't just something you apply to the circuit as a whole, and \$V\$ isn't necessarily the voltage you're driving through the circuit. \$V=IR\$ is also how you calculate the voltage drop across a given resistance.

If our load has a relatively high resistance and, as a result, our current \$I\$ is small, then the actual impact (the voltage dropped) of a wire of low resistance will be quite low indeed. If we have \$5\ \mathrm V\$ powering a \$1\ \mathrm{k\Omega}\$ load, and a wire between the voltage source and that load with a total resistance of \$5\ \mathrm{m\Omega}\$, then the total current will be \$5\ \mathrm V / 1000.005\ \mathrm{\Omega} = 4.999975\ \mathrm{mA}\$. This means that our \$5\ \mathrm{m\Omega}\$ wire, because of \$V=IR\$, will drop the voltage by \$4.999975\ \mathrm{mA} \times 0.005\ \mathrm{\Omega} = 24.999875\ \mathrm{\mu V}\$. Yes, 25 microvolts. Due to the wire, the load sees a voltage that is \$0.0005\ \%\$ lower than the \$5\ \mathrm V\$ it might otherwise expect.
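A minimal numerical sketch of the example above (the values are the ones from the text: a 5 V source, a 1 kΩ load, and 5 mΩ of total wire resistance in series):

```python
V_SOURCE = 5.0    # volts
R_LOAD = 1000.0   # ohms
R_WIRE = 0.005    # ohms (5 milliohms of wire resistance)

# Series circuit: one current flows through both resistances.
current = V_SOURCE / (R_LOAD + R_WIRE)   # amperes
v_drop_wire = current * R_WIRE           # V = IR, applied to the wire alone
error_pct = 100 * v_drop_wire / V_SOURCE # how much voltage the load "loses"

print(f"current     = {current * 1e3:.6f} mA")     # ~4.999975 mA
print(f"wire drop   = {v_drop_wire * 1e6:.3f} uV") # ~25 uV
print(f"load error  = {error_pct:.4f} %")          # ~0.0005 %
```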

A low current for a given total voltage across a circuit does indeed imply a high resistance for the entire circuit. But there is no reason to hold voltage constant: it is not constant through our circuit; rather, it drops by some amount across each series-connected element. The current is the same through elements connected in series, so each element drops a share of the total voltage proportional to how much of the total resistance it accounts for.

So when the current is low, the voltage dropped by something with low resistance relative to the total resistance will be a very small percentage of the total voltage. For a very small resistance in series with a very large resistance, no matter how much voltage you actually put across them, that small resistance will only ever drop a share of the voltage proportional to how much of the total resistance it accounts for. And when the current (and thus the voltage across the entire circuit) is small, the voltage drop caused by the small resistance will be very small.
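The proportionality claim can be checked numerically: in a series circuit, the fraction of the source voltage dropped by the wire is fixed by the resistance ratio, regardless of what voltage you apply. A quick sketch:

```python
R_WIRE = 0.005    # ohms: the "small" series resistance (the wire)
R_LOAD = 1000.0   # ohms: the "large" series resistance (the load)

fractions = []
for v_source in (1.0, 5.0, 100.0):
    i = v_source / (R_WIRE + R_LOAD)        # series current
    fractions.append(i * R_WIRE / v_source) # wire's share of the voltage

# Every entry is R_WIRE / (R_WIRE + R_LOAD) ~= 5e-6, independent of v_source.
print(fractions)
```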

Now, there are certainly precision applications where even a drop of microvolts must be accounted for and potentially calibrated out or minimized. Platinum resistance temperature detectors (RTDs) are a great example. They're little more than a resistor, typically \$100\ \mathrm{\Omega}\$ or \$1000\ \mathrm{\Omega}\$ (roughly) at \$0\ \mathrm{^\circ C}\$, with only a fairly small change in resistance versus temperature (less than \$0.5\ \mathrm{\Omega}\$ per degree for the \$100\ \mathrm{\Omega}\$ versions). To increase measurement accuracy, it is common to use four wires: two wires which carry the 'excitation current', usually a few \$\mathrm{mA}\$ through some series-connected resistors to induce a voltage drop across the RTD, and two sense wires which connect to an amplifier with very high input impedance. This minimizes the error caused by the wire resistance in the excitation wires carrying, say, \$5\ \mathrm{mA}\$ by simply bypassing them entirely. Separate voltage-sense and current-carrying wires like this are common in precision equipment and very high current applications (where even the low resistance of thick wires can introduce meaningful error due to the voltage drop).
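As an illustrative sketch of why the four-wire scheme helps (the lead resistance and excitation current here are assumed example values, not from any particular instrument): in a two-wire measurement the lead resistance is in the measured path, while the four-wire sense leads carry essentially no current, so their drop never appears in the sensed voltage.

```python
R_RTD = 100.0   # ohms: a 100-ohm platinum RTD at 0 degrees C
R_LEAD = 0.5    # ohms per lead wire (assumed example value)
I_EXC = 0.005   # amperes: 5 mA excitation current

# 2-wire: the sensed voltage includes the drop across both current-carrying leads.
v_2wire = I_EXC * (R_RTD + 2 * R_LEAD)
r_measured_2wire = v_2wire / I_EXC   # reads ~101 ohms: a 1-ohm error

# 4-wire (Kelvin): high-impedance sense leads see only the RTD itself.
v_4wire = I_EXC * R_RTD
r_measured_4wire = v_4wire / I_EXC   # reads ~100 ohms: no lead error

print(r_measured_2wire, r_measured_4wire)
```

With roughly 0.385 Ω per degree for a 100 Ω RTD, that 1 Ω two-wire error would correspond to an error of more than 2 °C.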

So there isn't any definite threshold where it automatically becomes totally safe to ignore small parasitics like wire resistance out of hand. Only you, with knowledge of the exact application or circuit you're designing, can make that judgement call. And most of the time, most small things like this can be ignored. Just don't forget about them, because they're always there, and will impact your circuit, negligible or not.

Parasitics come in more flavors than resistance. Parasitic inductance and capacitance are always present as well. And for a bonus headache, everything conductive is an antenna and can happily receive or emit electromagnetic waves, depending on its size. And everything that isn't conductive (dielectrics) will happily pass free electromagnetic waves where conductors would absorb or reflect them. Any of these things can balloon into your primary design concern, depending on what you're doing.

Anything can be ignored. At least until it can't be.