Power Distribution


I'm hoping someone can help answer a quick question. I'm trying to figure out how many 12 volt power supplies I can put on a single 20 amp circuit. Here are the details...

- I have a single dedicated 20 amp circuit that I am using to power my display.

- I am using quite a few different controllers that each run off a 350 watt / 12 VDC power supply. Assume that each power supply has enough pixels attached to max it out.

So...If I'm maxing out each of these low voltage power supplies, how many of these (loaded) systems can I put on my 20 amp circuit?

I looked online and found a ton of information about calculating the number of pixels per controller, but I found nothing on how many low voltage (DC) power supplies can be added to a high voltage branch circuit.

I would love it if there were a math equation to figure this out as well.

Thanks,

Tim


It's all about total watts, and how many amps that works out to. Watts divided by volts equals amps. A 20 amp circuit should not be loaded beyond 16 amps, and 16 A x 115 volts is 1840 watts. Power supplies are not 100% efficient either, though switching supplies are pretty good. Power supplies also should not be fully loaded, so figure about 300 watts on the DC side for a 350 watt supply as a good target, which comes out to roughly 350 watts on the AC side. That means that if your supplies are really fully loaded (with reserve capacity), about five of them would come close to maxing out a single 20 A circuit. These figures assume every pixel involved is at full white. That is usually not the case, so you can very likely exceed five power supplies.
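As a rough sketch of that budget in code (the 85% efficiency and the 300 W per-supply loading are assumptions carried over from the estimate above, not measured values):

```python
# Rough budget for 12 V supplies on one AC branch circuit.
# Assumed numbers: 115 V line, 80% breaker loading, ~85% PSU efficiency.
CIRCUIT_AMPS = 20
LINE_VOLTS = 115
DERATE = 0.80            # 80% rule for continuous loads
PSU_EFFICIENCY = 0.85    # typical switching supply (assumption)

budget_watts = CIRCUIT_AMPS * DERATE * LINE_VOLTS    # 1840 W available
dc_load_watts = 300                                  # target load per 350 W supply
ac_watts_per_psu = dc_load_watts / PSU_EFFICIENCY    # ~353 W drawn from the wall

max_supplies = int(budget_watts // ac_watts_per_psu)
print(budget_watts, round(ac_watts_per_psu), max_supplies)  # 1840.0 353 5
```

With every pixel at full white, that lands on the same "about five supplies" answer; dimmer, more typical shows leave room for more.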


The first rule is the 80% rule. Its main purpose is to prevent HEAT buildup in the house wiring.

It governs how much you can load an outlet AND the breaker. Why mention both? A standard outlet (NEMA 5-15) is rated at 15 A, but it can exist (if there are multiple receptacles) on a 20 A breaker. The 80% rule applies to continuous loads (3 hours or more, which is not likely for our shows), or 12 A per socket/plug (~1500 W). The breaker is 20 A, and 80% of that (whether feeding 20 A 5-20 devices or multiple 5-15 receptacles) is 16 A.

Next rule is: Nothing is 100% efficient 🤕

Your power supplies can run from roughly 80% to 95% efficient. FWIW, my 400 W supplies IDLE at ~10 W (including the LOR board logic, with no lights connected), so 100 W of lights will draw ~120 W from the wall. Again, we rarely run at 100% duty cycle (or intensity).

Bottom line: watch out for heat (a cord that is warm to the touch) and use cords with wire gauge rated for the load.
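A small sketch of the idle-plus-efficiency point above (the 90% efficiency figure is an assumed mid-range value, not from any datasheet):

```python
# Rough AC draw for one supply: fixed idle overhead plus conversion loss.
# The 10 W idle and 90% efficiency are assumptions; real supplies run ~80-95%.
def ac_draw_watts(dc_load_watts, idle_watts=10, efficiency=0.90):
    """Estimate watts pulled from the wall for a given DC lighting load."""
    return (dc_load_watts + idle_watts) / efficiency

print(round(ac_draw_watts(100)))  # ~122 W at the wall for 100 W of lights
```

That matches the ballpark above: 100 W of lights ends up costing roughly 120 W on the AC side.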


Ninjad (I was typing when you posted)

Funny, our replies were similar


Thank you both very much.  This is helpful, but it led to a bit more confusion on my part.

I completely understand the 80% rule of thumb, and I follow that.  Here is where I get confused...

Try this as an example...

- Say I have 550 pixels powered by a single 12 V power supply. Each pixel consumes 0.54 watts at full white, so those 550 pixels draw 297 watts (550 x 0.54 = 297 watts).

- I then figure out how many amps these pixels consume: watts / volts = amps (297 watts / 12 volts = 24.75 amps).

- Right here is where I get confused! How do I take that 24.75 amp result and figure out how many amps that is on the AC side?

Thanks,

Tim


27 minutes ago, pyrotech said:

How do I take that 24.75 amp result and figure out how many amps that is on the AC side?

You don't, really. You guesstimate. (And it is easier just to work in watts: 300 x 1.2 = 360, where 1.2 is the efficiency compensation factor.)

Then work on the line calcs: 360 W / 120 V is about 3 A. BUT how long are you at full all-white? Right 🙃, not much. That is your headroom.
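The guesstimate above, written out (the 1.2 factor is the rough efficiency compensation from this post, not a datasheet number):

```python
# Pad the DC wattage by an efficiency factor, then divide by line voltage
# to get the approximate AC amps. 1.2 is a rough loss compensation (assumption).
EFFICIENCY_FACTOR = 1.2
LINE_VOLTS = 120

dc_watts = 300                                    # fully loaded DC side
ac_watts = round(dc_watts * EFFICIENCY_FACTOR)    # ~360 W at the wall
ac_amps = ac_watts / LINE_VOLTS                   # ~3.0 A on the branch circuit
print(ac_watts, ac_amps)  # 360 3.0
```

So one heavily loaded 350 W supply is roughly a 3 A line item against the 16 A budget.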

What many forget is PSU INRUSH. You have probably noticed that when you first insert the plug you get a pop. That is the input caps charging (good designs limit it with a slow-start circuit).

Put multiple 350 W PSUs on a single cord, and the inrush can trip breakers even while the running current meets the 80% target.

💡 Get a 'Kill A Watt' test meter (available at Amazon and many hardware stores) so you have some real-life numbers. Remember that the meter is slow: pause your show (leave the lights on) to read the number.

BTW, you really never want to exceed the PSU's rated output wattage. One clue is the voltage at the terminals sagging. Some folks keep the lights dim to manage power.


1 hour ago, pyrotech said:

I then figure out how many amps these pixels consume... watts/volts=amps  (297 watts / 12 volts = 24.75 amps)

That's 24.75 amps at 12 volts, not 24.75 amps at 115 volts. 297 watts is about 2.5 amps at 120 volts.


The Mean Well LRS-350 power supply is listed at 6.8 A / 115 VAC.


Just now, Jimehc said:

The Mean Well LRS-350 power supply is listed at 6.8 A / 115 VAC.

That seems awfully high for a switching power supply.


14 minutes ago, k6ccc said:

That seems awfully high for a switching power supply.

That is the value at 90 V (the low end of the 90-132 V input range).


2 hours ago, k6ccc said:

That's 24.75 amps at 12 volts - not 24.75 amps at 115 volts.  297 watts is about 2.5 amps at 120 volts.

So... is it really that simple? Do I simply need to total the wattage of all my lights, divide by 120 volts (NOT 12 volts), and that is roughly how many amps I will use?

If so...I can power all of the following on a single 20 amp circuit...

- Power supply #1:  300 watts / 120 volts = 2.5 amps

- Power supply #2:  150 watts / 120 volts = 1.25 amps

- Power supply #3:  200 watts / 120 volts = 1.67 amps

- Power supply #4:  150 watts / 120 volts = 1.25 amps

- Power supply #5:  300 watts / 120 volts = 2.5 amps

- Power supply #6:  250 watts / 120 volts = 2.08 amps

- Power supply #7:  180 watts / 120 volts = 1.5 amps

- Power supply #8:  150 watts / 120 volts = 1.25 amps

- Power supply #9:  300 watts / 120 volts = 2.5 amps

Total AMPS = 16.5


Not quite. You did not take power supply efficiency into consideration. You also assumed 120 volts. Typical AC power in the US gets referred to by various numbers between 110 and 120; what it ACTUALLY is at your house will generally be somewhere in that range, and will likely vary some. If the voltage is lower, the current will be higher.
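To see how much the naive 16.5 A tally moves once an efficiency pad and a lower line voltage are folded in, here is a sketch (the 1.2 factor and the 115 V figure are assumptions drawn from earlier in the thread, not measurements):

```python
# Re-run the nine-supply tally with an efficiency pad and a lower line voltage.
# The 1.2 efficiency factor and 115 V line are assumptions from the thread.
DC_WATTS = [300, 150, 200, 150, 300, 250, 180, 150, 300]  # per-supply DC loads
EFFICIENCY_FACTOR = 1.2
LINE_VOLTS = 115
BREAKER_LIMIT_AMPS = 20 * 0.80   # 80% rule on a 20 A breaker -> 16 A

ac_amps = sum(w * EFFICIENCY_FACTOR / LINE_VOLTS for w in DC_WATTS)
print(round(ac_amps, 2), ac_amps <= BREAKER_LIMIT_AMPS)  # 20.66 False
```

On these assumptions the same nine supplies overshoot the 16 A budget, which is why the efficiency and voltage corrections matter before you commit to one circuit.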


Thank you all for the help!  Much appreciated.

Thanks,

Tim
