I came across this EE.SE question in a thread over 2 years old and couldn't really understand parts of the answer.
The asker had a 3.3V input signal that he wanted to convert into a 5V signal.
This is a circuit someone suggested:
Here is a comment he made about the circuit:
[…] the transistor is configured as an emitter follower and the voltage on the emitter is base voltage minus about 0.6V. If the emitter got higher it would turn off the transistor thus preventing the voltage rising much above about 3V. Think of base and emitter and what differential voltage they must be at to sta[r]t to turn the transistor on.
What I don't understand is:
- What is Vb when the emitter is at 0V? I know that Vbe = Vb – Ve, and that Vb is supposed to be 0.6V, but why? The 3.3V supply is connected to the base; doesn't it contribute anything? Is Vb determined solely by Ve?
- Pretty much the same question, but about Ve. If there is a voltage Vb set by the 3.3V supply and the resistor R1, then there should be a corresponding Ve according to the equation in (1). But if Ve is set by the 0–3.3V input, isn't there some kind of conflict?
- Why is the transistor off when the input (at the emitter) is 3.3V? According to the equation in (1), Vb should be Vb = Vbe + Ve = 0.6 + 3.3 = 3.9V. That means the base is at '1' (high), so the transistor should be on, no?
I assume the 3.3V supply limits Vb to 3.3V, but I'm asking anyway.
- Is there any reason why the resistors have these particular values?
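To check my own understanding of the questions above, here is a rough numeric sketch of how I *think* the base voltage behaves. It assumes the simple threshold model from the quoted comment (Vbe(on) ≈ 0.6V, base pulled toward 3.3V through R1, input driving the emitter); it is not a SPICE simulation, and the circuit details are my assumptions:

```python
# Rough threshold model of the NPN in the circuit being discussed.
# ASSUMPTIONS (mine, not from the original answer): the base is pulled
# toward the 3.3V rail through R1, the input signal drives the emitter,
# and the transistor conducts once Vb - Ve reaches about 0.6V.
VBE_ON = 0.6        # approximate base-emitter turn-on voltage, volts
V_BASE_RAIL = 3.3   # supply feeding the base through R1, volts

def transistor_state(v_emitter):
    """Return (conducting, vb) for a given emitter (input) voltage."""
    if V_BASE_RAIL - v_emitter >= VBE_ON:
        # Base current flows through R1; the forward-biased B-E junction
        # clamps the base at roughly Ve + 0.6V (R1 drops the difference).
        return True, v_emitter + VBE_ON
    else:
        # Not enough headroom to forward-bias the junction: no base
        # current, so no drop across R1 and Vb sits at the 3.3V rail.
        return False, V_BASE_RAIL

for v_in in (0.0, 3.3):
    on, vb = transistor_state(v_in)
    print(f"Vin(emitter) = {v_in} V -> conducting = {on}, Vb = {vb:.1f} V")
```

Under this model the answers to my first and third questions would be: with the emitter at 0V the junction clamps Vb near 0.6V (the rest of the 3.3V is dropped across R1), and with the emitter at 3.3V there is no 0.6V of headroom, so the transistor is off and Vb can never reach 3.9V. Is that the right picture?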