darpin
05-09-2005, 05:30 AM
Hello folks.
I'm trying to grow my first batch right now, and as such am asking all kinds of questions. I work as an electrical engineer, mainly in the broadcast realm.
There's this common problem with broadcast, where in order to double your power output, you need to consume the square of your original power. For example:
To double the output of 100W (100 x 100), you need 10,000 watts.
To double the output of 10W (10 x 10), you need 100 watts.
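Here's that rule of thumb written out as a throwaway Python snippet, just to sanity-check the arithmetic. To be clear, the rule itself is my assumption carried over from RF work, not something I've verified for lamps:

def input_needed_to_double(original_watts):
    # My claimed rule of thumb: doubling the output takes the square of the original power.
    return original_watts ** 2

for watts in (10, 100):
    print(f"{watts} W -> {input_needed_to_double(watts)} W to double the output")

# 10 W -> 100 W to double the output
# 100 W -> 10000 W to double the output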
Just curious whether lighting obeys the same laws; I don't see why it wouldn't. The spectrum I'm used to dealing with is VHF, but there are more commonalities than differences as you move up and down the spectrum, so the generation of visible light, infrared, and ultraviolet should behave much the same way.
If this is the case, it would explain why someone might get decent performance with a 100W HPS bulb, and need to go to 1000W to really notice a difference, and even then, you're only maybe 30% better (math is off).
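To put rough numbers on that, here's the same assumption carried to its conclusion: if doubling the output takes squaring the input, then output effectively scales with the logarithm of the input power (squaring the input just doubles its log). Again, this is only my hypothetical model, not measured lamp data:

import math

def relative_output(input_watts, reference_watts=100.0):
    # Under the assumed rule, output scales roughly with log(input power),
    # since squaring the input doubles its logarithm.
    return math.log(input_watts) / math.log(reference_watts)

print(f"1000 W vs 100 W: about {relative_output(1000):.2f}x the output under this assumption")
# -> about 1.5x, i.e. the same ballpark of modest gain I was guessing at above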
Thoughts, theories?