Quote Originally Posted by redline
I got a bunch of LM317s kicking around, but wasn't planning on using them.
I figured resistors would do the job as long as I was willing to do a 200 hour array break-in and readjust resistance if needed. I am also going to have a couple of panel current meters monitoring 6 arrays each. I am also going to have a couple more meters where I can spot check current to individual arrays.

Does an electronic regulator have about the same power loss as a resistor?
No, it has lower losses. But it depends on how much voltage it needs to drop. If it's little, the power loss is kept small. The closer the array voltage is to the 24V supplied by the PS, the smaller the voltage drop and the loss.

You will notice it by how hot the LM317 gets (the standard LM317 is rated for 1.5A output, well above your 0.5A target, so you could use them).

Quote Originally Posted by redline
Also, I figured using a 24 volt power supply is better than using a lower voltage since any power supply voltage variation is distributed across a larger number of LEDs.
Yep, but the issue here is not variation in the PS, but the different current that flows at the same voltage as the chips' temperature changes. So the larger the array power, the larger the variation:

Say Tj is about 80ºC, for example. The K2's Vf drops between 2 and 4 mV for each ºC above 25ºC. So 80-25=55ºC of increase in Tj leads to 165mV less per LED (at 350mA, using the average of -3mV/K). The more LEDs in the array, the larger the voltage drop along the array, so you need a higher-value resistor to compensate for it, but a lower total resistance than the sum of the two resistors you would need if you split the string into two series strings, each with its own resistor (so a 24V PS is better than a 12V one).

In this example, with 165mV less per LED, the associated increase in current is about 250mA. If you don't compensate for that with the resistor, the array will run at 750mA (in practice, still higher: at 750mA and only 0.165V less per LED, the power being burned is over 20% (near 25%) higher than initially, and that leads to the chips getting still hotter, and so on).

So you need a resistor such that, when all the LEDs in the array drop 165mV, the increased current raises the resistor's voltage drop by as close to that figure as possible. For example, if the string is 8 LEDs long, 8*165mV=1.32V. At 500mA, that implies 2.64 ohms. Power dissipated by the resistor would be 0.5A*1.32V=0.66W.
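The arithmetic above can be sketched like this (the 8-LED string and -3mV/K average tempco are the example's assumptions, not measurements):

```python
# Ballast-resistor sizing from the worked example above.
delta_tj = 80 - 25                 # K, junction temperature rise over 25 C
tempco = 0.003                     # V/K, magnitude of the K2's average Vf tempco
dvf_led = tempco * delta_tj        # 0.165 V less per LED when hot
n_leds = 8                         # LEDs in the example string
dv_string = dvf_led * n_leds       # 1.32 V total Vf drop along the string
i_target = 0.5                     # A, target current
r_ballast = dv_string / i_target   # 2.64 ohms
p_ballast = i_target * dv_string   # 0.66 W dissipated by the resistor
```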


But obviously, you will determine the value of the resistor according to the difference between the supplied voltage and the string's requirement at the target current. In this case, it's going to be similar: as you say it's going to be between 1 and 1.5V, it will be very close to what was calculated before (about 2.5 ohms, which drops 1.25V at 0.5A).

So if you use a 2.5 ohm resistor to regulate the current at 0.5A, then as the chips heat up, the current through the circuit rises. But for each 0.05A of increased current, the resistor drops 0.05A*2.5 ohm=0.125V more. At an increase of 250mA, it would drop 625mV more. In the first example, we calculated a drop of 165mV per LED, so the resistor would fully compensate a string about 4 LEDs long (0.625V/0.165V ≈ 3.8).
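A quick check of those compensation figures, using the example's values:

```python
# How much thermal Vf drop a 2.5-ohm ballast resistor absorbs.
r = 2.5                      # ohms, resistor from the example
dv_step = 0.05 * r           # 0.125 V more drop per 0.05 A of extra current
dv_full = 0.25 * r           # 0.625 V more drop at the full 250 mA increase
dvf_led = 0.165              # V, thermal Vf drop per LED (first example)
n_comp = dv_full / dvf_led   # ~3.8, so roughly a 4-LED string is fully compensated
```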

In order to calculate it accurately, you need to estimate the rise of Tj. The thermal resistance of your setup is about 20 K/W, eyeballing it. Maybe 25 K/W. At 0.5A, a blue K2 drops 3.51V (an average; you should measure it for your batch), which may fall to 3.35V after 200h. That's 3.51V*0.5A≈1.76W; at 25 K/W, an increase of Tj of about 44K. So at -3mV/K, you may expect a maximum voltage drop at each LED of about 132mV. On 24V you may fit 6 blue LEDs (at 3.51V each), up to 7 (at 3.35V). So you get a maximum drop along a blue string of 7*132mV≈0.92V.
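The same estimate in code (the 25 K/W thermal resistance and datasheet-average Vf are assumptions; measure your own parts):

```python
# Junction-temperature rise and resulting Vf drop for a blue K2 string.
rth_ja = 25.0             # K/W, assumed junction-to-ambient thermal resistance
vf_blue = 3.51            # V at 0.5 A, datasheet average
i = 0.5                   # A
p_led = vf_blue * i       # ~1.76 W per LED, treated here as all heat
dtj = p_led * rth_ja      # ~44 K junction temperature rise
dvf_led = 0.003 * dtj     # ~0.13 V drop per LED at -3 mV/K
dv_string = dvf_led * 7   # ~0.92 V for a 7-LED string
```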

I've done these calculations so you may know how to do them.

But the right way to do them is by first calculating the number of LEDs in each string. To do it accurately, you should measure the voltage drop of your LEDs at your target current. I'll use the average Vf from the K2's datasheet: 3.51V for blues and 3.22V for reds (at 0.5A).

The PS gives 24V. So each string fits 6 blues or 7 reds, leaving 2.94V (blues) and 1.46V (reds) to dissipate in the resistor. At 0.5A, you then need a resistor of 2.94V/0.5A=5.88 ohms (remember, R=V/I) for blue strings and 1.46V/0.5A=2.92 ohms for red strings.
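That sizing procedure, as a small sketch (using the datasheet-average Vf figures; your measured batch values go in their place):

```python
# String sizing for a given supply and the resulting ballast resistor.
def string_design(v_supply, vf, i_target):
    """Return (LEDs per string, leftover volts, ballast ohms); R = V / I."""
    n = int(v_supply // vf)         # whole LEDs that fit on the supply
    v_rest = v_supply - n * vf      # voltage left over for the resistor
    return n, v_rest, v_rest / i_target

blue = string_design(24.0, 3.51, 0.5)   # 6 LEDs, ~2.94 V, ~5.88 ohm
red = string_design(24.0, 3.22, 0.5)    # 7 LEDs, ~1.46 V, ~2.92 ohm
```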

In order to compensate for the voltage drop due to the increased temperature, always choose the closest resistance value above the calculated figure. That way, you set the cold current below the target, so as the LEDs get hotter, the final value ends up close to your actual target.

For the average Vf figures of the K2, that would be a 6 ohm resistor for blue strings and a 3 ohm one for the red strings.

Would this minimize the current swing? Let's calculate it, although from the calculation done in the example above, I know the answer is yes.

At 25 K/W (the estimated thermal resistance from junction to ambient), blues may increase Tj by up to about 44K, as seen before. At -3mV/K, that is 132mV less per LED. For 6 LEDs in the string, it is 792mV. How much will the resistor let the current increase before compensating for it? As I=V/R, I=0.79V/6 ohms=0.132A, about 130mA.

As the setting for cold LEDs was regulated below 0.5A, the final current under operating conditions is going to be lower than about 630mA. Likely, around 600mA.
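The swing estimate above, sketched out (a first-order estimate that ignores the feedback of the extra current on Tj; the per-LED drop is the figure assumed earlier):

```python
# Thermal current swing for a 6-LED blue string with a 6-ohm ballast.
r = 6.0                           # ohms, chosen blue-string resistor
dvf_led = 0.132                   # V less per LED when hot (estimated above)
n = 6                             # blue LEDs per string
dv = dvf_led * n                  # ~0.79 V total string Vf reduction
di = dv / r                       # I = V/R -> ~0.13 A of extra current
i_cold = (24.0 - n * 3.51) / r    # ~0.49 A, cold set point below target
i_hot = i_cold + di               # ~0.62 A, rough operating current
```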

The problem with this design is that the 6 ohm resistor is going to dissipate about 600mA at 3.6V, or over 2W wasted for each string.

Notice that the excess voltage under operating conditions for that Vf figure is pretty close to the Vf of another blue LED. If the Vf of your LEDs is exactly that, then you could use a string of 7 blue LEDs connected directly to the 24VDC PS without any resistor and get a current very close to 500mA. As long as the PS holds a solid 24V, it should work perfectly. But to do it you need the Vf of your LEDs to be exactly that (and likely it won't be, but who knows?).
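A quick sanity check of that resistor-less option (using the ~0.13 V per-LED hot drop estimated earlier; your batch's Vf will differ):

```python
# 7 blue LEDs straight across a 24 V supply: hot string voltage.
vf_hot = 3.51 - 0.132     # V per LED at operating temperature
v_string = 7 * vf_hot     # ~23.6 V, just under the 24 V supply
```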

Measuring the Vf of your LEDs accurately is a must if you are going to control them by voltage.


Quote Originally Posted by redline
I was trying to find info on how large arrays are set up in the sign industry, but could find very little on the web.
They mostly use strings of 3 LEDs with a resistor on a 12V PS, and prefer to forget about the reliability of the system. Mainly because they rarely use LEDs over 200mW, where the thermal effect is far less important than with high-power LEDs.

Quote Originally Posted by redline
I don't know how to calculate thermal efficiency (yet), so I have another question. I figured mounting bare emitters directly to the heat sink is preferable to stars, since you eliminate an extra thermal barrier. But you need to use an adhesive instead of a grease which cuts down a bit on thermal efficiency. Any thoughts on best way to go...stars or emitters?
If you keep the thermal adhesive layer thin enough (by applying some pressure during the curing process), the added thermal resistance is kept below 1.5 K/W, and very often below 1 K/W. Of course, a good thermal adhesive makes a difference. Arctic Silver is common and very good.

If possible, the bare emitters are better. Mounting is more complicated, but it's worth the thermal improvement.

The problem with many stars is that they are of poor quality: thick dielectric layers, poor soldering jobs, bad adhesives. Very often they penalize the thermal path. With good ones, there is little difference from an emitter glued directly to the heatsink.

But if you know how to mount the emitters directly, it's cheaper and better.
knna