I found this a few years ago. I can't remember where, but I hope it helps.

To figure out how much amperage your unit is drawing, use the following formula: wattage divided by voltage equals amperage.
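If you'd rather not do the division by hand, here's a quick Python sketch of that formula (the function name and the 120 volt default are just my choices for the example):

```python
def amps(watts, volts=120):
    """Amperage = wattage / voltage."""
    return watts / volts

# A 1000 watt unit on a standard 120 volt circuit:
print(round(amps(1000), 1))  # -> 8.3 amps
```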

The average household circuit breaker is rated at 15 amps. In other words, if the total amperage drawn from that circuit exceeds 15 amps, the circuit breaker will trip. You probably have one or two circuits per room. If you have a 1000 watt unit running off a 120 volt circuit, it will draw approximately 8.3 amps (1000 watts / 120 volts = 8.3 amps).

Make sure your household circuits or fuses are in good condition and are rated at least 20% over what the load will be (e.g. no more than a 12 amp load on a 15 amp circuit).
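Here's a small sketch of that headroom check, assuming a 15 amp breaker and the 20% margin mentioned above (the function name and defaults are mine):

```python
def fits_on_breaker(load_amps, breaker_amps=15, margin=0.20):
    """True if the load leaves at least `margin` headroom below the breaker rating."""
    return load_amps <= breaker_amps * (1 - margin)

print(fits_on_breaker(8.3))   # True  -- a 1000 W fixture on a 120 V circuit fits
print(fits_on_breaker(12.5))  # False -- over the 12 amp guideline for a 15 amp circuit
```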

As for the cost of electricity to run your grow light, we recommend that you check with your local power company, since the cost will vary depending on the geographical area. Find out what you are charged for one kilowatt-hour (kWh) of power. 1 kWh = 1000 watts running for 1 hour, i.e. ten 100 watt light bulbs burning for 1 hour.
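In code terms, kilowatt-hours are just watts times hours divided by 1000 (again, a rough sketch with names of my own choosing):

```python
def kwh(watts, hours):
    """Energy used, in kilowatt-hours, by a load of `watts` running for `hours`."""
    return watts * hours / 1000

# Ten 100 watt bulbs for one hour:
print(kwh(10 * 100, 1))  # -> 1.0 kWh
```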

Example - if your area charges 6 cents per kWh, a 1000 watt fixture actually draws about 1100 watts, so it uses 1.1 kWh each hour.
Let's say you burn your light 14 hours per day: 14 hours x $0.066 per hour (0.06 x 1.1) = about $0.92 per day.
If you burn it 30 days per month, the cost will be 30 x $0.92 per day = roughly $27.60 per month.
If you burn a 400 watt lamp, it draws about 460 watts, or 0.46 kWh per hour.
So it will cost 0.06 x 0.46 = $0.028, or 2.8 cents per hour; times 14 hours per day that is about 39 cents per day, and times 30 days per month about $11.70 per month. For reference: 1000 watt fixtures draw about 1100 watts, 400 watt fixtures about 460 watts, 250 watt fixtures about 295 watts, and 175 watt fixtures about 210 watts.
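Putting the whole cost example together, here's a sketch that reproduces those numbers (the 6 cent rate, 14 hours per day, and 30 days per month are just the example values above; the fixture draws are the ones listed):

```python
# Approximate wall draw for each lamp rating, per the list above
FIXTURE_DRAW_WATTS = {1000: 1100, 400: 460, 250: 295, 175: 210}

def monthly_cost(lamp_watts, rate_per_kwh=0.06, hours_per_day=14, days=30):
    """Estimated monthly electricity cost for one fixture at the example rate."""
    draw_kw = FIXTURE_DRAW_WATTS[lamp_watts] / 1000   # e.g. 1.1 kW for a 1000 W lamp
    return draw_kw * rate_per_kwh * hours_per_day * days

print(f"${monthly_cost(1000):.2f}")  # -> $27.72 (the hand math above rounds to $27.60)
print(f"${monthly_cost(400):.2f}")   # -> $11.59 (rounded above to $11.70)
```

The small differences from the figures above just come from rounding to the nearest cent partway through the hand calculation.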

The fixture draws more watts than the lamp's rating because of ballast inefficiency and heat loss.