Electronic – Is a digital bit represented by any two discrete signals?

computer-architecture computers voltage

I am trying to understand computers from the building blocks up. I know computers use transistors, such as MOSFETs, to switch and amplify voltages, and that this is what arithmetic is built from.

However, what really makes up a typical computer bit? Is it just two input currents, one higher and one lower? Two possible states? Is movement a bit (run or walk)?

How can I make a bit? Do I just need an emitter, a current, and multiple paths?

Would I still have a bit if the signal couldn't be amplified (i.e., a simple series circuit with no current amplification)?

Best Answer

A computer bit can be thought of as a signal transmitted from one place to another over a wire. The "digital" nature of a bit means it can be represented by any two states agreed upon between the circuit that drives the bit level onto the wire and the circuit that detects that level at the other end. The two states could be defined as specific voltage ranges, current ranges, or even directions of current flow. The most common scheme in digital circuits is to use voltage levels: one state is produced by pulling the signal line toward ground (GND), and the other by pulling it toward the supply voltage.
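To make the "two voltage ranges" idea concrete, here is a small C sketch of how a receiving circuit might classify a wire voltage as a bit. The `V_IL`/`V_IH` thresholds are illustrative values loosely modeled on classic 5 V TTL logic, not anything specified in the answer above; real logic families define their own ranges.

```c
#include <stdio.h>

/* Illustrative receiver thresholds, loosely based on classic 5 V TTL:
 * anything at or below V_IL reads as logic 0, anything at or above V_IH
 * reads as logic 1, and voltages in between are undefined (a real
 * receiver may resolve them either way). */
#define V_IL 0.8  /* maximum input voltage guaranteed to read as '0' */
#define V_IH 2.0  /* minimum input voltage guaranteed to read as '1' */

/* Interpret a wire voltage as a bit: returns 0, 1, or -1 for "undefined". */
int read_bit(double volts) {
    if (volts <= V_IL) return 0;
    if (volts >= V_IH) return 1;
    return -1; /* forbidden region: not a valid logic level */
}

int main(void) {
    double samples[] = {0.2, 1.4, 3.3};
    for (int i = 0; i < 3; i++) {
        int b = read_bit(samples[i]);
        if (b < 0)
            printf("%.1f V -> undefined\n", samples[i]);
        else
            printf("%.1f V -> logic %d\n", samples[i], b);
    }
    return 0;
}
```

The gap between the two thresholds is the point of the scheme: as long as the driver pulls the line well clear of the forbidden region, noise and component variation don't change which bit the receiver sees.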

Bit signal levels transmitted through a circuit on signal wires always need to originate from somewhere. They can originate from electromechanical components, such as switches or relays, that hold the lines at the defined voltage levels. They can also originate from special circuits that hold, or store, the signal level. Such circuits, commonly referred to as latches or flip-flops, are designed so that their outputs drive signal wires to one of the two defined digital states that represent the '1' and '0' values.
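As a rough illustration of the storage idea, here is a minimal C model of an SR (set/reset) latch built from two cross-coupled NOR gates, one of the simplest circuits that can hold a bit. The `Latch` type and `latch_update` function are hypothetical names of mine, and the fixed-count update loop is a crude stand-in for the electrical feedback settling into a stable state.

```c
#include <stdio.h>

/* Minimal model of an SR latch made from two cross-coupled NOR gates:
 * the feedback between the gates is what "holds" the stored bit even
 * after the set/reset inputs go back to 0. */
typedef struct {
    int q;  /* stored bit (Q output) */
    int qn; /* complement output (Q-bar) */
} Latch;

/* Iterate the two NOR gates a few times until the outputs settle,
 * mimicking the latch reaching a stable electrical state. */
void latch_update(Latch *l, int set, int reset) {
    for (int i = 0; i < 4; i++) {
        int q  = !(reset || l->qn); /* Q  = NOR(R, Q-bar) */
        int qn = !(set   || l->q);  /* Q' = NOR(S, Q)     */
        l->q  = q;
        l->qn = qn;
    }
}

int main(void) {
    Latch l = {0, 1};
    latch_update(&l, 1, 0); printf("after set:   Q=%d\n", l.q); /* Q=1 */
    latch_update(&l, 0, 0); printf("held:        Q=%d\n", l.q); /* Q=1 */
    latch_update(&l, 0, 1); printf("after reset: Q=%d\n", l.q); /* Q=0 */
    latch_update(&l, 0, 0); printf("held:        Q=%d\n", l.q); /* Q=0 */
    return 0;
}
```

Note how the two "held" calls apply no input at all, yet Q keeps its previous value: that self-reinforcing feedback loop is what turns two gates into one bit of storage.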