Happy to help.

Ohm's law is the most important basic law of electricity. It defines the relationship between the three fundamental electrical quantities: current, voltage, and resistance. When a voltage is applied to a circuit containing only resistive elements (i.e., no coils), current flows according to Ohm's law, shown below.
[align=center]I = V / R
[/align]
Where:
[align=right]I = [/align] Electrical Current (Amperes)
[align=right]V = [/align] Voltage (Volts)
[align=right]R = [/align] Resistance (Ohms)
[align=left]Ohm's law states that the electrical current (I) flowing in a circuit is proportional to the voltage (V) and inversely proportional to the resistance (R). Therefore, if the voltage is increased, the current will increase, provided the resistance of the circuit does not change. Similarly, increasing the resistance of the circuit will lower the current flow if the voltage is not changed. The formula can be rearranged so the relationship is easy to see for all three variables.
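The rearranged forms can be checked with a quick calculation. This is just a sketch; the voltage and resistance values here are made up for illustration:

```python
# Ohm's law: I = V / R, with rearrangements V = I * R and R = V / I.
# Example values are hypothetical, purely to illustrate the relationships.
V = 12.0   # volts
R = 6.0    # ohms

I = V / R           # current in amperes
print(I)            # 2.0

# The rearranged forms recover the other two quantities:
print(I * R)        # 12.0 (volts)
print(V / I)        # 6.0  (ohms)
```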

Using this simple formula, I trick a dirt-cheap three-terminal voltage regulator into regulating the current for my emitters.
I tie a 1 ohm, 10 W resistor to the regulator's output pin and draw current through it to the LEDs.
Then I tie the sense lead of the regulator to the other end of the resistor.
That sense lead, often labeled ADJ, is used to set the output voltage and is ordinarily tied to a potentiometer to vary the output voltage.
Because of its internal construction, it needs 1.4 V on the sense lead to start regulating.
So it will try to change its output voltage to keep the math happy.
As we draw more current, the voltage across that resistor will try to climb.
The chip will throttle its output to keep that from happening.
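The setpoint that feedback loop lands on follows straight from Ohm's law. A minimal sketch, using the 1.4 V figure quoted in this post (check your regulator's datasheet for the actual reference voltage):

```python
# Sketch of the current-limiting trick described above. The 1.4 V reference
# is the figure given in this post; real regulators vary, so check the datasheet.
V_REF = 1.4     # volts the regulator holds across the sense resistor
R_SENSE = 1.0   # ohms (the 1 ohm, 10 W resistor)

# The regulator adjusts its output until the drop across R_SENSE equals V_REF,
# so the load current settles at I = V_REF / R_SENSE:
I_set = V_REF / R_SENSE
print(I_set)    # 1.4 (amperes)
```

Picking a different sense resistor changes the regulated current in the same way: half the ohms, twice the amps.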

So, my "driver" is a 40-cent regulator and a 10-cent resistor, and it regulates current flow of up to 1.5 A.
If you need to control higher current, you could splurge and spend two bucks on a MOSFET and two more resistors to hang off the end.



For power, it's:
Watts equal volts times amperes: P = V × I in a purely resistive circuit.
So if we drop 2 volts across the regulator and draw 1.2 amps through it, it will have to dissipate 2.4 W. (Won't even need a heat sink.)
And the resistor will drop a little over a volt at the same current, so ~1.3 W.
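A quick sketch of that arithmetic, using the figures from this post:

```python
# Power check for the figures above: in a resistive circuit, P = V * I.
V_drop = 2.0    # volts dropped across the regulator
I_load = 1.2    # amps drawn through it

P_reg = V_drop * I_load
print(P_reg)    # 2.4 (watts the regulator must dissipate)

# The sense resistor: P = I^2 * R with the nominal 1 ohm value,
# in the same ballpark as the rough figure above.
P_res = I_load ** 2 * 1.0
print(P_res)    # 1.44
```

Either way, well under the resistor's 10 W rating.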
Why do I use a 10 W resistor?
Headroom!
Resistance increases with temperature and we want a cool and stable supply.
I strap the regulator to a heatsink for the same reason.
The cooler your electronics run, the longer they will live.

That's why we overspec drivers, too. If you want to drive 40 watts' worth of emitters, it's best to use a 50-60 watt driver.
Same reason you should never try to draw more than 80 amps from a 100 amp service panel.

It's foolhardy to load any electrical circuit to 100% of its capacity.
A very tiny voltage spike can cost you the whole game.
If you Google Ohm's law, you will get a much clearer explanation.
I'm just a talented amateur at electronics, and not very good at teaching.



Aloha,
Weeze

[/align]