
    
 
  #361 (permalink)  
Old March 28, 2010, 09:33 PM
b1lk1's Avatar
Hall Of Fame
 
Join Date: Feb 2007
Location: Lindsay, Ontario
Posts: 2,531


Coming from a tried and true ATI fanboy (me), I will go on record to state that Fermi is not a fail; it just isn't nearly enough win after a six-month wait with promises never met. All arguments aside, I have always said these cards will sell fast, and they should: they are excellent cards. They are just underwhelming, that is all.

PS: Anyone who doubts SKYMTL's talents really needs a reality check. Statements like that are pure trolling and won't provoke a reaction. You don't just plug cards in and run tests; that is ignorance at its finest...
__________________
MY HEATWARE
  #362 (permalink)  
Old March 28, 2010, 10:29 PM
SKYMTL's Avatar
HardwareCanuck Review Editor
 
Join Date: Feb 2007
Location: Montreal
Posts: 11,835

Quote:
Originally Posted by thorn View Post
Maybe the difference between HardOCP's results and yours for DiRT 2 (both using patch 1.1) comes down simply to choosing a different level. HardOCP is using the Battersea Bridge level; it isn't mentioned what level was chosen for the HWC review. That is the only thing I can see that would cause such a wide difference in results.
I wasn't talking about HardOCP since I know they used the right settings and patch. However, their results just can't be compared to anyone else's because of their methodology.
  #363 (permalink)  
Old March 28, 2010, 10:41 PM
SKYMTL's Avatar
HardwareCanuck Review Editor
 
Join Date: Feb 2007
Location: Montreal
Posts: 11,835

Quote:
Originally Posted by traitoR View Post
No offence at all, and thanks for the grounded reply. I do tend to shy away from change, so you are right there. At no time did I say he didn't do an adequate job; I simply said he seemed a bit forgiving of the negatives in light of the incremental performance increase. I offered up what I thought were its negatives, and obviously this didn't go over so well with the crowd here. Why that is is anybody's guess.
I respect your opinions since your statements have opened a good debate.

It all depends on what you deem "incremental". I might have been "forgiving" in your books simply because I didn't have unrealistically high expectations for the architecture in the first place. Many people did. Others just want any excuse to find fault with anything NVIDIA.

I knew what the architecture was and wasn't capable of. As such, I didn't go into this review and come away disappointed by my own overblown expectations. I called it what it is: a good-performing card that has issues with power consumption and heat.

However, you have to look at it from a different perspective. I don't consider 8-20% incremental at all, but it's not a quantum leap either. Yes, someone coming from an HD 5870 would have zero reason to upgrade, but what other option is there right now for people who want more performance than an HD 5870 can deliver yet don't want to spend $700+ on an HD 5970? Dual-card solutions? You and I both know how horribly finicky those are.

I'll admit I have been too lenient in the past, namely for not taking ATI to task for their approach to increasing prices, for riding on NVIDIA's coattails when it came to GPGPU implementation, for the substitution of FP16 render targets with lower-quality ones in a number of games... the list goes on. That is an issue every website shares (including us), unfortunately, and it has proven to be a massive problem when it comes to fair and unbiased reviews, not of NVIDIA cards but of those from the red camp. It very rarely has anything to do with credibility but rather with time, since many of us editors just don't have the time to investigate things these days. Others are satisfied to put in the game, run a script that automatically benches a ton of games in one go, walk away, and not do ANY investigating regarding the proper way to benchmark certain apps.

With this review, on the other hand, I think NVIDIA took lumps where they deserved them (power, heat, poor 2560 resolution performance, etc.) and got credit where it was due (pricing, performance, etc.).

So I fail to see what the issue is. I see one card which didn't win an award and one which won exactly the award it deserved at the time of writing. How either situation changes in the coming weeks, particularly in terms of pricing, is anyone's guess.

Last edited by SKYMTL; March 28, 2010 at 10:49 PM.
  #364 (permalink)  
Old March 28, 2010, 11:33 PM
burebista's Avatar
Allstar
 
Join Date: Sep 2007
Location: Romania
Posts: 599


Quote:
Originally Posted by SKYMTL View Post
C) The 15% performance increase is based on the combination of DX9, DX10, AND DX11 games. If you single out DX11 performance at the resolutions most people play at (1680 and 1920), the GTX 480 simply eats the HD 5870 alive, especially when it comes to minimum framerates. Seriously, what fool buys a DX11 card and cares about DX9 performance when 99.99% of today's $200 GPUs don't have an issue in DX9 at high quality?
Absolutely agree.
But what games do we have with DX11 and tessellation? I listed them here. Here in the EU, the GTX 480 is ~50% pricier than the HD 5870. Is it worth it for gaming right now?

BTW SKY, if you still have both Fermis around, I have something to ask you. Could you run a Metro 2033 benchmark at 1920x1200 with 4xAA/16xAF, DoF, tessellation, PhysX, and everything on max? I didn't see those settings benchmarked anywhere (understandable, because ATI cannot render PhysX), but I want to see how Fermi does in their marketing game.
Remember that the recommended optimum settings for Metro are a Nehalem CPU, a Fermi card, an SSD, and 8 GB of RAM.
__________________
If it ain't broke... fix it until it is.
  #365 (permalink)  
Old March 29, 2010, 12:18 AM
sideeffect
Rookie
 
Join Date: Mar 2010
Posts: 10

Interesting review. Just some points to consider.

Minimum FPS is important, but showing just one number as the minimum does not reflect the real situation. A graphics card might dip to a low number for a single second of a benchmark run for any of several reasons, yet sustain a much higher minimum for the rest of the run. The proper way to display minimum FPS is with a graph.
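To make that concrete, here is a quick sketch (with made-up FPS samples, not figures from any review) of how a single minimum can misrepresent a run; a percentile low, like the data behind a frame-rate graph, tells a very different story:

```python
# Hypothetical per-second FPS samples from a 100-second benchmark run:
# one momentary dip to 24 FPS, while the card holds 55+ FPS otherwise.
fps = [60] * 50 + [58] * 30 + [55] * 19 + [24]

def percentile_low(samples, pct):
    """FPS value at the given low percentile (e.g. the '5% low')."""
    ordered = sorted(samples)
    idx = max(0, int(len(ordered) * pct / 100) - 1)
    return ordered[idx]

print(min(fps))                # 24 -- the single "minimum" most reviews print
print(percentile_low(fps, 5))  # 55 -- what the card actually sustains
```

The single minimum reports the one-second dip; the 5% low shows the card effectively never drops below 55 FPS in normal play.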

Fermi was obviously going to be faster than the 5800 series, because the 5800 series was already released and its performance was a known quantity; NVIDIA could choose Fermi's clocks to ensure it was faster at release. What is interesting is the clocks NVIDIA chose: the card runs very close to its limits on air cooling. The 5800 series, on the other hand, was clocked very conservatively at release in order to meet a good TDP.

Let's take an HD 5850, clocked at 725 MHz core and 1000 MHz memory. The memory on the card is actually rated for 5 Gbps effective, and the core reaches 1 GHz with a very slight voltage increase. Nothing is stopping ATI from releasing such a card, and they are already being released by partners such as MSI (the Lightning). Keep in mind that this card would still use less power and run cooler on air than Fermi, yet have a ~37% faster core clock.
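The ~37% figure follows directly from the clocks cited above (values as claimed in the post, not independently verified):

```python
# HD 5850 stock core clock vs. the cited overclock, in MHz.
stock_core, oc_core = 725, 1000
core_gain = (oc_core - stock_core) / stock_core * 100
print(f"{core_gain:.1f}% faster core clock")  # 37.9% faster core clock
```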

Had the situation been reversed, with Fermi released six months ago and ATI launching now, would the clocks on either card have been the same? Probably not: Fermi would have been clocked lower and the ATI card higher. So the most important comparison between these cards is performance per watt, which many reviews have overlooked.
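Performance per watt is easy to compute once a review publishes average FPS and measured board power; a minimal sketch with placeholder numbers (not measurements from this or any review):

```python
# Placeholder figures for illustration only -- not measured values.
cards = {
    "Card A": {"avg_fps": 115, "board_watts": 250},
    "Card B": {"avg_fps": 100, "board_watts": 188},
}

for name, d in cards.items():
    ppw = d["avg_fps"] / d["board_watts"]  # frames rendered per joule-second
    print(f"{name}: {ppw:.3f} FPS/W")
```

With these made-up numbers, the slower card would actually deliver more performance per watt, which is exactly the kind of result a raw FPS chart hides.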

As for performance in DirectX 11 games, most reviews show the two series very close to each other: DiRT 2, AvP, Bad Company 2, and S.T.A.L.K.E.R. all show a very small difference. Metro 2033 is very new, and NVIDIA had more access to the game before launch, so performance might even out there too.

Tessellation is being publicised as an advantage for Fermi, yet little proper investigation has been done to show whether this holds outside of synthetic tests. The method each card uses to perform tessellation greatly favours NVIDIA in a synthetic test: Fermi can devote more of its processing power to tessellation than ATI can. But does that make it better? In a game, the GPU performs many other tasks, and Fermi can't devote as many resources to tessellation as it can in a synthetic test. A fair test would show the difference in FPS with tessellation on and off during actual gameplay. This has not been done properly yet.
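That fair test reduces to a simple relative-cost figure once you have matched on/off runs of the same scene; a sketch with hypothetical FPS values:

```python
def tessellation_cost(fps_off, fps_on):
    """Percent of frame rate lost with tessellation enabled (same scene)."""
    return (fps_off - fps_on) / fps_off * 100

# Hypothetical matched runs of one game scene:
print(tessellation_cost(80.0, 64.0))  # 20.0 -> a 20% frame-rate cost
```

Run per-card and per-game, this would show whether Fermi's synthetic tessellation lead survives the competing workloads of a real game.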

I think the best review of the 480 was performed by Guru3D.
  #366 (permalink)  
Old March 29, 2010, 01:50 AM
MonsterSound's Avatar
Allstar
F@H
 
Join Date: May 2007
Location: YYZ
Posts: 691

...some of these comments ???

Quote:
Originally Posted by sideeffect View Post
Interesting review. Just some...

So the most important comparison between these cards should be performance per watt which has been overlooked as important by many reviews. ...
So, how green your vid card is, is supposed to be the most important thing to judge when reviewing the latest most powerful vid card technology? Better get me a 5450.
__________________
Code:
 my Heatware,  RFD &  eBay
MAIN: i5 3570 @ 4.5Ghz wH100 / evga GTX780 SOC / MSI Z77A-G45 / 8gb G.Skill 2133 1T / 30" 2560x1600 / Antec P280 / PCP&C 950w / Intel 520 SSD 240gb / Eclipse / G3-Lazer / z5300 / Win7 Pro64 / APC Xs1300
HTPC : nMedia6000B / AMD X2 4400+ / M2A-VM / Corsair 4GB 800 / LiteOn Blu-Ray / 8800GT / Samsung 46"LCD / 2x2TB / OCZ500 / ATI550 / Wrless Kbrd & Rmte / z-5500 / Win7Pro / APC700
"You must be the change you want to see in the world" 
- Mahatma Gandhi  
  #367 (permalink)  
Old March 29, 2010, 02:18 AM
wembler
Rookie
 
Join Date: Mar 2010
Posts: 28

Quote:
Originally Posted by MonsterSound View Post
So, how green your vid card is, is supposed to be the most important thing to judge when reviewing the latest most powerful vid card technology? Better get me a 5450.
Monster, I don't feel you really grasped sideeffect's statement, so why not address it? It was about clock speeds and the timing of release, and it was a valid point.

Last edited by wembler; March 29, 2010 at 04:23 AM.
  #368 (permalink)  
Old March 29, 2010, 03:09 AM
MonsterSound's Avatar
Allstar
F@H
 
Join Date: May 2007
Location: YYZ
Posts: 691


Quote:
Originally Posted by wembler View Post
Monster, I don't feel you really grasped sideeffect's statement, so why not address it? It was about clock speeds and the timing of release, and it was a valid point.
A litany of "what if" hypotheticals and conjecture spanning from six months ago until now. What's to grasp that I didn't grasp?
How about dealing with what we have now instead? Contrary to sideeffect's own statement, tessellation has been shown to be processed more effectively by the GF100 than by the HD 5000 series, by many reviewers including sideeffect's favourite, Guru3D: "As you can see, the NVIDIA GTX 400 series kicks ass." So much so that they had a hard time believing their own StoneGiant tessellation data.
The GF100's advantages and disadvantages make sense when you examine its architecture, not by second-guessing the clocks NVIDIA chose, looking for excuses, picking at a minimum-frame-rate number versus a graph, or speculating about the clocks ATI chose for their release ~6 months ago.
P.S. I'm not going to undervolt my toaster just to win the all-important performance-per-watt statistic.
  #369 (permalink)  
Old March 29, 2010, 03:36 AM
traitoR
Top Prospect
 
Join Date: Jan 2008
Location: Mj,Sk
Posts: 83


Quote:
Originally Posted by SKYMTL View Post
I respect your opinions since your statements have opened a good debate.

It all depends on what you deem "incremental". I might have been "forgiving" in your books simply because I didn't have unrealistically high expectations for the architecture in the first place. Many people did. Others just want any excuse to find fault with anything NVIDIA.

I'm not looking for a reason to fault NVIDIA; it looks to me like they are pushing the chip beyond reasonable thermals. I foresee the hardware forums heating up with Fermi problems shortly (pun intended).

Quote:
Originally Posted by SKYMTL View Post
I knew what the architecture was and wasn't capable of. As such, I didn't go into this review and come away disappointed by my own overblown expectations. I called it what it is: a good-performing card that has issues with power consumption and heat.
I'm not trying to butt heads, but I read more than once your suggestion that others should wait on an ATI purchase for Fermi, alluding to it looking very strong in comparison. At the time I thought you must have some inside info, but in retrospect it looks like you did indeed have high expectations that carried over a little too enthusiastically.

Quote:
Originally Posted by SKYMTL View Post
However, you have to look at it from a different perspective. I don't consider 8-20% incremental at all, but it's not a quantum leap either. Yes, someone coming from an HD 5870 would have zero reason to upgrade, but what other option is there right now for people who want more performance than an HD 5870 can deliver yet don't want to spend $700+ on an HD 5970? Dual-card solutions? You and I both know how horribly finicky those are.
You have to factor in the cost of additional cooling and power capacity on top of Fermi's base price; it's not a drop-in replacement for just any old system. Don't forget the added noise from your harder-working PSU on top of the already loud cooler. I also see mention of clock speed jumps when more than one monitor is used (and who doesn't run two these days?), which causes a horrible spike in idle draw and idle temps. And what about aftermarket heatsinks? Just what is going to replace a five-heatpipe, direct-touch cooler with a crazy-loud fan and still cool better?

You also have to wonder whether, as per the norm, these are cherry-picked parts that run cooler and clock well, seeded to reviewers for launch. Just what will the mainstream cards a guy can actually buy... eventually... be like? Kind of scary, really.


Quote:
Originally Posted by SKYMTL View Post
I'll admit I have been too lenient in the past, namely for not taking ATI to task for their approach to increasing prices, for riding on NVIDIA's coattails when it came to GPGPU implementation, for the substitution of FP16 render targets with lower-quality ones in a number of games... the list goes on. That is an issue every website shares (including us), unfortunately, and it has proven to be a massive problem when it comes to fair and unbiased reviews, not of NVIDIA cards but of those from the red camp. It very rarely has anything to do with credibility but rather with time, since many of us editors just don't have the time to investigate things these days. Others are satisfied to put in the game, run a script that automatically benches a ton of games in one go, walk away, and not do ANY investigating regarding the proper way to benchmark certain apps.
ATI certainly comes in last when it comes to pushing their tech into usability via software; they need to rethink their priorities, without going overboard and locking out features or purposely crippling competitors' performance.

Don't get me wrong, I appreciate the effort at providing good content as much as the next guy, and I suppose I'm just being critical.

Quote:
Originally Posted by SKYMTL View Post
With this review, on the other hand, I think NVIDIA took lumps where they deserved them (power, heat, poor 2560 resolution performance, etc.) and got credit where it was due (pricing, performance, etc.).

So I fail to see what the issue is. I see one card which didn't win an award and one which won exactly the award it deserved at the time of writing. How either situation changes in the coming weeks, particularly in terms of pricing, is anyone's guess.
We're all entitled to our own opinions. I'm glad you don't take immediate offence to mine, nor I to yours. I was simply voicing my concerns, and it seems a good number of posters here take umbrage at people holding different values.
  #370 (permalink)  
Old March 29, 2010, 03:39 AM
wembler
Rookie
 
Join Date: Mar 2010
Posts: 28

Quote:
Originally Posted by MonsterSound View Post
A litany of "what if" hypotheticals and conjecture spanning from six months ago until now. What's to grasp that I didn't grasp?
?????????? Hello?

Quote:
Originally Posted by MonsterSound View Post
The GF100’s advantages and disadvantages make sense examining its architecture, not by what clocks nVidia chose, or by looking for excuses, or picking at its minimum frame rate, number vs. graph, or speculation about the clocks ATI chose for their release ~6 months ago.
P.s. I'm am not going to undervolt my toaster just to have the most important performance per watt statistic.
Make sense in relation to what? What excuses? And where's the speculation about clock speeds? The clock speeds were released to the public along with the cards six months ago.

Quote:
Originally Posted by MonsterSound View Post
I'm not going to undervolt my toaster just to win the all-important performance-per-watt statistic.
If you're speaking from a competitive gamer's perspective, at least, why would you? And if you have even a half-decent card right now, the 480 will not take you from mid-pack to the top three on the ladders. Only you can do that!