110v vs 220v Comparison
Has anyone done a comparison on the bigger-wattage power supplies and figured out the power savings and difference in heat output running them on 220 V vs 110 V?
It would seem to me that as PC power supplies get into the 1000 W range, you could start saving a bit of $$$ on power. If the near future brings units larger than 1500 W, 220 V is going to be a sure thing...
I'm kinda hoping that computers get more efficient... but very interesting train of thought anyway :thumb:
I wouldn't lose any sleep over it ...
Not many will purchase a 1000 W PSU, even among geeks. Who is crazy enough to need one? :haha:
Just a lot of hype ...
In my limited knowledge of power, I believe that the power draw will be the same at 110 V vs 220 V. There will not be any savings made on the basis of voltage alone; any efficiencies will have to come from the physical construction of the PSU. A 220 V circuit can deliver more power at the same amperage, as can 480 V, with all related systems built increasingly more robust to handle the higher loads.
When efficiency testing is done, 220 V always fares better from everything I've seen. However, it's not a significant amount where you'd save hundreds of dollars.
If you want to save power, don't overclock and leave all the power management stuff enabled so that it down-clocks when idle.
With wattage being volts x amps, the "draw" from the wall in amps will be half at 220 V vs 110 V.
It's easier to design something to handle 10 amps of current than 20, and a lot less heat should be involved, I think.
It sounds like you're equating voltage as being tied to amperage, but the two values are completely separate.
So anyway, if you have a system that eats 1100 watts of power from the wall at 110 volts, that'll be 10 amps of current. Likewise, 1100 watts of power from the wall at 220 volts would be 5 amps of current.
Sorry if I misunderstood what you said. I don't have an overly fantastic understanding of how it all works, but I *think* I know the basics...
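To double-check that arithmetic, here's a quick Python sketch. Nothing PSU-specific, just P = V x I rearranged:

```python
# Line current drawn from the wall for a given power and line voltage.
# Same wall power, half the current at double the voltage.

def line_current(watts, volts):
    """Amps pulled from the outlet: I = P / V."""
    return watts / volts

print(line_current(1100, 110))  # -> 10.0 A on a 110 V circuit
print(line_current(1100, 220))  # -> 5.0 A on a 220 V circuit
```

Halving the current is exactly why less resistive heating happens in the wiring and input stage.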
Efficiency at 220 V is ~2-3% better than at an equivalent power draw at 110 V. No big deal to me.
When you draw 1 kW from the outlet, saving 20-30 W looks like a joke to me. :biggrin:
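Quick back-of-envelope in Python for a box that runs 24/7. The $0.12/kWh rate is an assumption for illustration, not anyone's real bill:

```python
# Rough yearly savings from a ~2-3% efficiency bump at 220 V.
# Assumed numbers: 25 W saved (middle of the 20-30 W range above)
# and a $0.12/kWh utility rate.

HOURS_PER_YEAR = 24 * 365
RATE_USD_PER_KWH = 0.12      # assumed rate, varies by utility

watts_saved = 25
kwh_saved = watts_saved * HOURS_PER_YEAR / 1000   # -> 219.0 kWh/year
print(f"~{kwh_saved:.0f} kWh/year, about ${kwh_saved * RATE_USD_PER_KWH:.2f}/year")
```

So call it a couple dozen dollars a year for a machine that never idles. Whether that's a joke or real money depends on how many folding rigs you run.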
I know on paper you get some gains; I was just wondering if anyone has done real-world tests...
I'm thinking of a PC-based FOLDING machine that runs 24/7 @ 100% duty. You have to be saving some $$$ at 220 V.
OK, so if I want to buy a BIG power supply that will be running full out 24/7, it's silly not to run it at 220 V.
I think it would be an interesting fact to post the cost savings at that voltage. And if the manufacturer says it's 80% efficient, is that 80% at 110 V or 220 V?
I have a feeling that if it does both, they will use the 220 V numbers to make it look better.
I just figured someone must have done this, but maybe I'm wrong. Everyone is tweaking the hell out of their machines trying to get more speed for less, trying to cool them...
220 V means more efficiency, so less $$$ and less heat...
Hahaha, maybe there just needs to be a PSUID program before tweaking your PSU, or getting a high-end one will become important ;)
Thanks burebista, that's exactly what I wanted to see.
It looks like on that one power supply you need to run at less than 50% of its rated load to get the 83% efficiency...
That 700 W PSU needs to see a 250-300 W draw to be in the zone; anything much over or under that and the efficiency drops off big time.
I was under the impression that the efficiency window was a lot wider than that...
I will still need a big power supply; I'm planning to draw about 1200 W at full duty.
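For what it's worth, here's a rough Python sketch of how much the efficiency point matters at a load like that. The 80% vs 83% figures are illustrative (taken from the numbers in this thread), and I'm treating 1200 W as the DC-side load:

```python
# Wall draw for a given DC load at two assumed efficiency points.
# Illustrative only: 1200 W load, 80% vs 83% efficiency.

def wall_draw(dc_load_watts, efficiency):
    """Power pulled from the outlet to deliver a given DC load."""
    return dc_load_watts / efficiency

low  = wall_draw(1200, 0.80)   # -> 1500.0 W from the wall
high = wall_draw(1200, 0.83)   # roughly 1446 W from the wall
print(f"Difference: {low - high:.1f} W wasted as extra heat")
```

A 3% efficiency difference at that load is on the order of 50 W of extra heat, which is why where the load sits on the efficiency curve matters as much as the 110 V vs 220 V question.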