Old August 14, 2010, 06:13 PM
Arinoth
Join Date: May 2009
Location: Halifax
Posts: 9,158

Originally Posted by 0o0
It's just Zero and his buddies complaining that nvidia uses a couple more watts than ati I guess It's sad because it's stupid, no one in their right mind would pick a vehicle over another one because the engine runs cooler. Even if it uses 1L less per 100km it shouldn't be a factor. That's why nobody gives a goodgoddamn if fermi uses a couple more watts or runs a bit hotter, ati fanboys have completely blown it out of proportion and reality. I'd take a 470 over a 5870 any day, and I've had a 5870 before. If you were to care about the exact temperature every square cm your PC runs at and how your neighbours are doing and sh1t, you'd probably have a head full of white hair. Nvidia comes with cuda/physx and ati doesn't have something like that which most games actually use, so it's a winning factor for me.

I dunno about metro 2033 as a benchmark though.. it looks to be pretty much a port from console, added a few useless rendering features that make 'high' detail mode look exactly the same as 'low' with a tint of yellow. Personally I don't look at benchmarks for any game that first had a console version, so it's pretty much Crysis 1 that's the deciding factor for me now...

It's actually a legitimate complaint. Higher power consumption means your power supply has to deliver more juice to the card, and that juice doesn't come from nowhere: it comes from the wall outlet, and the hydro company meters every kilowatt-hour. If you're folding 24/7 or gaming heavily, a card that draws more watts will noticeably raise your bill each month, and the difference adds up over a year.
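To put a rough number on it, here's a minimal sketch of the math. The wattage difference, hours of use, and electricity rate below are assumptions for illustration, not figures from this thread:

```python
# Hedged sketch: estimate the yearly electricity cost of extra GPU power draw.
# All numbers in the example call are assumptions, not measured values.

def yearly_cost(extra_watts, hours_per_day, rate_per_kwh):
    """Cost of the extra draw over one year, in the currency of rate_per_kwh."""
    kwh_per_year = (extra_watts / 1000.0) * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# e.g. a card drawing 50 W more, folding 24/7, at an assumed $0.10/kWh:
print(round(yearly_cost(50, 24, 0.10), 2))  # 43.8
```

So even a modest 50 W gap can mean tens of dollars a year for someone who folds around the clock, which is why the complaint isn't just fanboy noise.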
Here I am: here I remain