Why do we use a 330 ohm resistor to connect an LED?
Correct formula for LED current-limiting resistor?
In practice, R is commonly 330 ohm.
Why this value? How do I calculate it? What is its purpose?
Is there a specific parameter of the LED that determines this value?
Why is R 330 \$ \Omega \$ "by practice"? I hadn't heard of that practice until now.
The resistor is not 330 Ohm "by practice". The value depends on the voltage drop and what current you want through the LED. 330 Ohms gets you about 9 mA with a 5 V supply and a 2.1 V LED. That is no more the right answer than many others.
@OlinLathrop I'm here to understand something that I don't understand ... I've just seen many designs that use 330 ohm.
He did NOT ask what the "correct" value to use was. He asked why people use 330 Ohms "as of right". The difference is perhaps subtle but perhaps important. Or not :-)
The typical red LED gives reasonable brightness at 10 mA forward current, and drops about 1.7 V at that current. That means you want the circuit to deliver 10 mA, so you need to drop the remaining 3.3 V across the series resistor. Ohm's Law gives 330 ohms for that setup: R = 3.3 V / 0.010 A.
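That back-of-the-envelope calculation is easy to sketch in code. A minimal Python version, assuming the typical red-LED figures quoted above (1.7 V drop, 10 mA target):

```python
# Series-resistor calculation for the classic red-LED case described above.
# Vf = 1.7 V and I = 10 mA are the typical figures quoted in the answer.
def series_resistor(v_supply, v_forward, i_led):
    """Return the series resistance (ohms) that sets the LED current."""
    return (v_supply - v_forward) / i_led

r = series_resistor(5.0, 1.7, 0.010)
print(round(r))  # ~330 ohms, the "traditional" value
```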
Each color of LED requires a different amount of current to achieve the same brightness. Or, put another way, they need the current limited to different amounts to prevent overheating and destruction. Have you Googled around a bit to find sites like this? https://www.kitronik.co.uk/blog/which-resistor-should-i-use-with-my-led/
This is to limit the current through the LED; without a resistor, the LED will draw current until it burns out.
The voltage drop across an LED depends on its color; for a blue LED, for example, it is about 3.4 V. So if you have a 5 V power supply and want 5 mA through the LED (5 mA usually gives good visibility), you need a (5 V - 3.4 V) / 0.005 A = 320 Ohm resistor. (That is, this resistance gives a voltage drop of 1.6 V across the resistor; the remaining 3.4 V drops across the LED, for 5 V total.)
Red LEDs usually have a smaller voltage drop (~2 V), so you'll get a slightly higher current with the same resistor, but anything below 20 mA is usually OK. Somewhat smaller currents are fine too; LEDs at 1 mA are easily visible.
PS. A few extra things:
1) The light output of an LED is roughly proportional to the current until it is well over specifications. That's why everyone talks about the current through the LED.
2) Personally I throw a 220 Ohm resistor into 5 V circuits to make it really bright :-)
But on a recent project where I had a 3.3 V supply and LEDs of different colors (green, red, blue), I had to calculate the resistances more carefully, and they came out to 68 Ohm for blue and 220 Ohm for green and red.
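The same arithmetic for a mixed-colour 3.3 V design can be sketched as below. The forward voltages here are rough typical values (an assumption, not from any particular datasheet), and the results still need rounding to the nearest standard resistor value:

```python
# Series-resistor calculation for several LED colours on one supply rail.
# The forward voltages here are rough typical values (assumptions),
# not taken from any specific datasheet.
TYPICAL_VF = {"red": 2.0, "green": 2.1, "blue": 3.0}

def series_resistor(v_supply, v_forward, i_led):
    """Return the series resistance (ohms) for a target LED current."""
    return (v_supply - v_forward) / i_led

for colour, vf in TYPICAL_VF.items():
    r = series_resistor(3.3, vf, 0.005)  # aim for roughly 5 mA
    print(f"{colour}: {r:.0f} ohms (round to a standard value)")
```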
A series resistor limits the current to a value which can be designed for if you know the supply voltage, the LED voltage drop at the desired current, and the desired current. See the LED data sheet for the typical Vled at a given current. Then -
- Iled = (Vsupply - Vled) / Rseries, or
- Rseries = (Vsupply - Vled) / Iled.
Many small LEDs are rated for 20 mA max operation.
Using 330 ohms in series is a "lazy man's", calculation-free and thought-free method of ensuring that an LED can be safely operated on a 5 V supply while still giving a reasonably large fraction of the output it would have at 20 mA.
330 ohms may be used by some people as a "get you going" value that works "well enough" in many cases.
The purpose of the resistor is to "drop" the voltage that is not required to operate the LED when the LED is running at the desired current. Because the forward voltage of an LED varies with colour, chemistry, and current, and because the "desired" current varies with the user's needs, there is NO single correct value. See "Procedure:" at the end for a step-by-step application of this.
White LED: forward voltage Vf ≈ 3.3 V.
On a 5 V supply, resistor voltage Vr = 5 - 3.3 = 1.7 V.
Current Iled = Vr/R = 1.7/330 = 5.15 mA, so ≈ 5 mA.
Red LED: Vf ≈ 2.2 V.
Vr = 5 - 2.2 = 2.8 V.
Iled = 2.8/330 = 8.48 mA, so ≈ 8.5 mA.
IR LED: Vf ≈ 1.8 V, giving Iled ≈ 10 mA.
In the above cases Iled varies from about 5 mA to about 10 mA.
A factor of 2:1.
In reality the currents will be somewhat higher, because the typical Vf values I used are specified at around 20 mA.
At lower currents Vf is lower (see LED data sheets), so the resistor drops more voltage, so there is more current, and so on.
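The worked examples above can be checked with a short script (the forward voltages are the approximate figures used in this answer):

```python
# Current through a fixed 330-ohm series resistor for several LED types.
# Forward voltages are the approximate figures used in the answer above.
EXAMPLES = {"white": 3.3, "red": 2.2, "IR": 1.8}

def current_ma(v_supply, v_forward, r_series):
    """Return the LED current in mA for a given series resistor."""
    return (v_supply - v_forward) / r_series * 1000.0

for name, vf in EXAMPLES.items():
    print(f"{name}: {current_ma(5.0, vf, 330.0):.1f} mA")
# Prints roughly 5.2, 8.5 and 9.7 mA: the ~2:1 spread noted above.
```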
Procedure:
- Specify the desired current, I_LED.
- Specify the supply voltage, Vs.
- Use the data sheet to determine the typical LED 'forward' voltage drop at the specified current, Vf.
- The voltage drop across the resistor, Vr, is the portion of Vs which is not across the LED, ie Vr = Vs - Vf.
- The resistor value R is given by Ohm's law: R = V/I, where V is the voltage across the resistor and I is the current through the LED and resistor in series.
- So: R = V/I = Vr / I_LED = (Vs - Vf) / I_LED.
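The procedure above can be wrapped in a small helper function. This is only a sketch; the datasheet Vf still has to be looked up by hand:

```python
def led_series_resistor(v_supply, v_forward, i_led):
    """Steps from the procedure above:
    Vr = Vs - Vf, then R = Vr / I_LED (Ohm's law)."""
    v_resistor = v_supply - v_forward
    if v_resistor <= 0:
        raise ValueError("supply voltage must exceed the LED forward voltage")
    return v_resistor / i_led

# Example: 5 V supply, white LED with Vf = 3.3 V, target 10 mA.
print(round(led_series_resistor(5.0, 3.3, 0.010)))  # ~170 ohms
```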
I wouldn't say that the purpose of the resistor is to drop voltage that isn't required for the LED to operate; this is misleading. In the datasheet for most LEDs there will be a graph that shows the relationship between the voltage across the LED and the current through it. By limiting the voltage to a value on that graph we are limiting the current through the LED so as not to damage it. Some LEDs may have different I/V curves, so saying that an LED does not require 5 V to operate may not always be the case.
@jduncanator Current control is usually the focus when driving LEDs, BUT my statement, taken in context, is precisely correct: it's what the resistor does and it's what it's for. There are of course various ways of looking at the same situation, and you can focus on LED current or Vf or ... . I covered the issues you mention in the (now 4.5-year-old) answer. Except in exceptional and non-typical cases, modern single LEDs have a max Vf of about 3.5 V at typical currents. 4 V may happen but is unusual. For white, the minimum at rated current is about 2.8 V for some high-efficiency LEDs.
I didn't mean that what you said was wrong; I just tried to provide some more insight as to why you need a resistor in series with LEDs. I personally thought that _The purpose of the resistor is to "drop" voltage that is not required to operate the LED_ sounded a bit misleading. Maybe that is just me.
@jduncanator I'd agree with you completely if that statement were the only one made, or even if it were how I started the answer. But I have to "agree in part": the statement is one of several, and it needs (I think) saying, even if it's not how the problem is usually looked at. Consider: I want to operate an LED at, say, 20 mA, and the data sheet says Vf will typically be 3.2 V at this current, BUT I have a 5 V supply. Q: What "happens to" the remaining 1.8 V? A: We can choose to use a resistor to 'drop' the remaining voltage. For a beginner this MIGHT clarify what is otherwise a quite obscure process.
... ie, how else do you describe it in a way that makes clear what is going to happen, in a non-iterative fashion? I hope you'll find that it's actually quite a good fundamental view. If instead you say "I have a 5V supply and want 20 mA LED current, what resistor do I use?", you will then need to ask 'What is Vf?', then 'What is Vr?' to get R = (Vsupply - Vf)/I_LED. Which is the same general process, but what MAY well be a more confusing path. Either way, all paths lead to Ohm [ :-) ], but mine involves the minimum number of steps. | OR, if you prefer, you can start with the method that my answer started with, viz: ...
... "A series resistor limits the current to a value which can be designed for if you know V supply, LED voltage drop at desired current, and desired current. See LED data sheet for typical Vled at a given current. Then ...". So, choose whichever approach suits.
This answer has good examples. This can be a bit hard for beginners to grasp, since LEDs don't work like resistors. Another way to word it is that pretty much any resistor value in this circuit sees the same voltage drop (5 V minus the LED's roughly fixed drop), so the goal is to choose a resistor that allows an appropriate amount of current, below the LED's maximum.