# Electronic – Powering an LED with transistor in saturation mode

Tags: led, npn, saturation, transistors

I am currently trying to design my first transistor circuit to switch an LED on and off. So I came up with this circuit diagram:

I have to use the following components:

What I need to figure out are the values of R1 and R2. So first I picked a desired current of 70mA from the 5V source to GND. According to the LED datasheet, 70mA of current induces a voltage drop of 1.3V. Figure 11 of the transistor datasheet shows that the saturation voltage at 70mA is roughly 0.06V.
Now we can calculate R2 using Ohm's Law:

$$R2 = \frac{5V - V_{D1} - V_{CE(sat)}}{I_{C}} = \frac{5V - 1.3V - 0.06V}{0.07A} = 52\Omega$$
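For reference, a quick script to sanity-check that resistor value (using the datasheet figures quoted above):

```python
# Series resistor for the LED, with the transistor driven into saturation.
V_SUPPLY = 5.0    # supply voltage, V
V_LED = 1.3       # LED forward voltage at 70 mA (from the LED datasheet), V
V_CE_SAT = 0.06   # V_CE(sat) at 70 mA (from the transistor datasheet), V
I_C = 0.07        # desired collector current, A

# Ohm's law over the voltage left across R2
R2 = (V_SUPPLY - V_LED - V_CE_SAT) / I_C
print(f"R2 = {R2:.0f} ohm")  # → R2 = 52 ohm
```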

In order to get the base current \$I_{B}\$, I looked for DC current gain in the datasheet. The lowest value is \$\beta = 10\$ as shown in Figure 11. Therefore

$$I_{B} = \frac{I_{C}}{\beta} = \frac{0.07A}{10} = 7mA$$
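The same worst-case base current can be checked numerically:

```python
I_C = 0.07   # collector current, A
BETA_MIN = 10  # worst-case DC current gain from the transistor datasheet

# Base current needed to guarantee saturation at the worst-case gain
I_B = I_C / BETA_MIN
print(f"I_B = {I_B * 1000:.0f} mA")  # → I_B = 7 mA
```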

This is definitely a problem: I don't want to draw more than 2mA from the logic source, as it could be damaged. To stay on the safe side, I need to find a way to increase the lowest possible value of \$\beta\$ so that the required base current drops.
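Turning the 2mA limit around gives the minimum gain the circuit would need (a quick derivation, using the values above):

```python
I_C = 0.07       # desired collector current, A
I_B_MAX = 0.002  # maximum current the logic source should supply, A

# Rearranging I_B = I_C / beta: the gain must satisfy beta >= I_C / I_B_MAX
beta_min_required = I_C / I_B_MAX
print(f"required beta >= {beta_min_required:.0f}")  # → required beta >= 35
```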

Is this the way to go? And if yes, how would I accomplish an increase of \$\beta\$?

Please don't tell me to buy other transistors with higher DC current gain, although I will definitely do that later.