my basic understanding of electricity is as follows:
Voltage is the difference in electric potential between 2 points — equivalently, how much work is done per unit charge as a charged particle is moved between those 2 points.
Current is the flow of charged particles and is a consequence of voltage. In a circuit there can be no current without voltage, although there can be voltage without current.
Resistance is a measure of how much a material inhibits current.
The relationship between these three is expressed by Ohm's law:
V = I x R
The problem I have is that when I apply this understanding to a series circuit, I struggle to wrap my head around how current and resistance can remain constant throughout the circuit while voltage varies. Shouldn't the current drop along with the voltage? Other people have asked similar questions, but I could not really understand the answers, so please keep it simple for me. Thank you.
Best Answer
Figure 1. Two 2 kΩ loads on a 10 V supply.
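The numbers behind Figure 1 can be worked through step by step. This is just a minimal sketch of the arithmetic (the variable names are mine, not from the schematic): in a series circuit the same current flows through every element, so Ohm's law is applied once to the total resistance, and each resistor then drops its own share of the supply voltage.

```python
# Values from Figure 1: two 2 kΩ loads in series on a 10 V supply.
V_supply = 10.0                 # volts
resistors = [2000.0, 2000.0]    # ohms (R1, R2)

# Series elements carry the same current, so Ohm's law applies
# to the total resistance of the loop:
R_total = sum(resistors)        # 4000 Ω
I = V_supply / R_total          # 2.5 mA, the same everywhere in the loop

# Each resistor "drops" a voltage of V = I * R across itself:
drops = [I * R for R in resistors]

print(f"Current through the loop: {I * 1000:.2f} mA")
for n, v in enumerate(drops, start=1):
    print(f"Drop across R{n}: {v:.1f} V")

# The individual drops add back up to the supply voltage
# (Kirchhoff's voltage law):
print(f"Sum of drops: {sum(drops):.1f} V")
```

So the current does not drop along the way: it is set once by the supply voltage and the *total* resistance, and what varies from point to point is the potential, falling by 5 V across each 2 kΩ resistor.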
Figure 2. See my answer to Intuitive interpretation of negative voltage for more on voltage references. Image original by @Transistor.