Hardware Canucks > HARDWARE > Video Cards

  #81 (permalink)  
Old November 7, 2013, 05:20 AM
ipaine's Avatar
Hall Of Fame
F@H
 
Join Date: Apr 2008
Location: Edmonton, AB
Posts: 2,076


Quote:
Originally Posted by Gordon freemen View Post
Sigh the 780ti cannot use more than 4GB Vram let alone 8GB Vram. Don't fall into this Vram hype because it is absolutely BS that has not even been very well quantified yet.

First, this is complete misinformation. Why? Because it would have 3, 6, or, as some rumors suggest, 12GB of VRAM, not 4 or 8.

Second, yes, you very much can use that much VRAM. Chances are that at 1080p, 3GB is fine right now, but for how much longer into the future, let alone at 1440p, 1600p, and beyond? This is very apparent if you mod a game with high-res textures, as you can with games like Skyrim. Hell, it's even more apparent once you start talking about 4K displays, which are pretty rare now but won't be forever.

Point is, while 3GB may be enough for you, it will not be enough for everyone.
__________________
"Nothing sucks more than that moment during an argument when you realize you're wrong."


  #82 (permalink)  
Old November 7, 2013, 05:50 AM
sf101's Avatar
MVP
 
Join Date: Jun 2012
Posts: 316

I'm not sure if people realize it, but even with my 7970 at 1290 MHz core / 1950 MHz memory, and my CPU at 4.8 GHz with memory at 2666 MHz 11-13-11, I run into situations where I can't run games at ultra everything above 120 FPS at 1080p.

So I really, really don't get where all this talk of not needing better than a 770 or 280X for 1080p is coming from.

You don't need 1440p to make use of a GTX 780/780 Ti or 290/290X, especially when this next year of games comes out. I mean, heck, most of the games we are playing are still stuck on DX9 and taxing current-gen cards; my current setup is proof of this.

I like running everything maxed, eye candy on. Now, if 1440p monitor prices drop, they may become mainstream, and all the better. Until then, it's still nice to know you can max eye candy and maintain 120 FPS solidly, even if it's only at 1080p.
__________________
My 5 Current Rigs-Guest-Main-Work/Spare-HTPC-Laptop
I7930@4.0GHz-Guest
3770K@ 4.8GHz -Main
2500k @ 4.7GHz -work/Spare
3570K @ 4.5 -HTPC
MSI GT70-2OC/770m-998/4800MHz -Laptop
SF101 HW BOT
GPUs: "xfire" 2x XFX 290X @ 1150/1500 "EKWB" | Giga OC 7970 WF3 @ 1250/1850 | Asus 480GTX @ 900/2000 | 2x EVGA 260 Core 216 SLI in HTPC
Retired GPUs: 9600 ATI | 9800 Pro ATI | 8800GTS "320mb & 640mb" | 8800GTX | 9800GT | 9800GX2
  #83 (permalink)  
Old November 7, 2013, 12:37 PM
Banned
 
Join Date: Sep 2012
Posts: 96

Quote:
Originally Posted by sf101 View Post
I'm not sure if people realize it, but even with my 7970 at 1290 MHz core / 1950 MHz memory, and my CPU at 4.8 GHz with memory at 2666 MHz 11-13-11, I run into situations where I can't run games at ultra everything above 120 FPS at 1080p.

So I really, really don't get where all this talk of not needing better than a 770 or 280X for 1080p is coming from.

You don't need 1440p to make use of a GTX 780/780 Ti or 290/290X, especially when this next year of games comes out. I mean, heck, most of the games we are playing are still stuck on DX9 and taxing current-gen cards; my current setup is proof of this.

I like running everything maxed, eye candy on. Now, if 1440p monitor prices drop, they may become mainstream, and all the better. Until then, it's still nice to know you can max eye candy and maintain 120 FPS solidly, even if it's only at 1080p.
I completely agree with this rationale. The sweet spot for 1080p would be CF HD 7970 GHz Edition or an equivalent SLI setup, and even then there will be some games, like Metro: Last Light, Crysis 3, Serious Sam BFE, Far Cry, etc., that will still have drops below the monitor's refresh rate, even at 60Hz.
  #84 (permalink)  
Old November 7, 2013, 01:13 PM
Soultribunal's Avatar
Moderator
F@H
 
Join Date: Dec 2008
Location: Mississauga
Posts: 8,066


Quote:
Originally Posted by rjbarker View Post
^^^^ Just a heads up ZZ... many games will show that they are using as much VRAM as you can throw at 'em; this certainly doesn't mean that the game requires it. In other words, a 2GB card can show utilization of 1.9GB while the same game running on a 3GB card can show 2.5GB utilization. ...just to clarify ;)
True enough. Since Wargaming revised their engine (massive upgrade, plus the texture package), I peg 88% VRAM usage on both of my 2GB 650 Ti's, and that is at 1080p. There are a lot of titles that will pull everything they can. I get 60 FPS, but just barely. In the city, with SLI and my 3770K chugging away, it dips.

I think one of these cards might just be the solution.
That or hopefully find someone selling a 690 I can afford lol.

-ST
__________________




"We know not why he calls for us, only that when he does we must answer" - DMP 2009

"Dear Iceberg, I am sorry to hear about global warming. Karma is a bitch. Signed - Titanic"

I would rather believe and find god doesn't exist than to not believe and find that he does.

www.realhardwarereviews.com
  #85 (permalink)  
Old November 7, 2013, 01:47 PM
Rookie
 
Join Date: Jul 2013
Posts: 35

Can't wait for the Classy version of this. Going to be a lot of fun to bench. SLI should go hand in hand with a 2560x1440 120Hz G-Sync monitor, if those come out next year.
  #86 (permalink)  
Old November 7, 2013, 01:50 PM
Banned
 
Join Date: Sep 2012
Posts: 96

Quote:
Originally Posted by SeeThruHead View Post
Can't wait for the Classy version of this. Going to be a lot of fun to bench. SLI should go hand in hand with a 2560x1440 120Hz G-Sync monitor, if those come out next year.
G-Sync? Um, sounds like more marketing BS. V-sync is smooth as can be as is.