#21 · January 9, 2014, 09:00 AM · muse108dc (Hall Of Fame)

Quote:
Originally Posted by Dzzope
Like all the new features that need GCN??

I agree I'd like to see less proprietary stuff from Nvidia.. But at least it works and does so reliably (generally)

Both have their flaws but for me Nvidia is (at the moment at least) much more polished rather than a "you could do this but we're not going to fully develop it".
At this point in time nothing NEEDs GCN. Mantle will, but try making a low-level API that works across two different manufacturers' architectures; and even then, AMD has said Nvidia can use Mantle if they so wish.

Polish is only worth so much if you're going to be an ass about everything you develop. We're in this for the betterment of PC gaming, NOT the monopolization of it.

#22 · January 9, 2014, 09:27 AM · Dzzope (Hall Of Fame)

Ohh, you mean all the features they have promised but that we have yet to see anything solid on...

No... I'm in it for the best I can get for my money, not to support incompetence, misleading information or, at worst, flat-out lying. Now, I'm not so naive as to think either side is above dirty tricks, but AMD has a record of not delivering (or at least not fully), whereas Nvidia has, for the most part.

#23 · January 9, 2014, 07:17 PM · Hall Of Fame, F@H

Nvidia has some pretty bad history, but the last 3-4 years they have had a stellar track record imo.

#24 · January 9, 2014, 07:36 PM · Shadowarez (Allstar)

I can see it being better optimized on AMD hardware than Nvidia. Even though it's not proprietary, I'm sure AMD will have some optimization somewhere to make it work better on their hardware.

#25 · January 9, 2014, 07:54 PM · Fortier (Rookie)

I'm really glad to see that they announced this; I was fearful that they were going to try to force unwanted hardware on consumers as well.

#26 · January 9, 2014, 08:15 PM · MVP (Ottawa)

Quote:
However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.
Nvidia responds to AMD's "free sync" demo - The Tech Report
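
For anyone wondering what "extending the vblank interval" actually means in practice, here's a rough sketch of the idea (purely hypothetical pseudo-driver code, not Nvidia's or AMD's actual implementation; the panel limits are assumed example values). The driver simply holds the panel in its vertical blanking period until the next frame is ready, so the effective refresh interval follows the render time, clamped to whatever range the panel supports:

Code:
#include <stdint.h>

/* Hypothetical sketch of variable refresh via vblank extension.
 * Not real driver code; the panel limits below are assumed values. */

#define MIN_FRAME_US  6944u   /* ~144 Hz: fastest refresh the panel allows */
#define MAX_FRAME_US 33333u   /* ~30 Hz: longest the panel can hold vblank */

typedef struct {
    uint64_t last_scanout_us;  /* timestamp of the previous scan-out */
} panel_state;

/* Called when the GPU finishes rendering a frame. Returns how many
 * microseconds of extra vblank were inserted before scan-out. */
uint64_t present_frame(panel_state *p, uint64_t now_us)
{
    uint64_t frame_time = now_us - p->last_scanout_us;

    if (frame_time < MIN_FRAME_US)
        frame_time = MIN_FRAME_US;    /* frame finished too fast: pad vblank */
    if (frame_time > MAX_FRAME_US)
        frame_time = MAX_FRAME_US;    /* panel can't hold vblank any longer */

    p->last_scanout_us += frame_time;   /* scan the new frame out "now" */
    return frame_time - MIN_FRAME_US;   /* the stretched vblank portion */
}

As I read Petersen's point, G-Sync's module does that clamping and handshaking inside the monitor because no off-the-shelf desktop scaler ASIC can do it yet, while eDP laptop panels already let the GPU drive the timing directly, which is why AMD's demo was on laptops.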

#27 · January 9, 2014, 11:33 PM · Shadowarez (Allstar)

Quote:
Originally Posted by Fortier
I'm really glad to see that they announced this, I was fearful that they were going to try to force unwanted hardware on consumers as well.
As a gamer-oriented piece of hardware, I don't mind paying for this. It's like going from a stone-age standard to an industrial standard.

We finally have a solution to a problem that arose in the early days of 3D gaming. I for one welcome this tech, as I'm sure a lot of people will.

#28 · January 10, 2014, 08:09 PM · Hall Of Fame member (Calgary)

Quote:
Originally Posted by SKYMTL
Yeah...no.

Sorry, after other "me too!" flubs by AMD (OpenCL physics, HD3D, etc.) I am taking a serious wait and see approach to this one.

That might sound bias and it is, simply because we have been here before and AMD has proven again and again they don't properly support technologies being released to simply take PR steam out of NVIDIA's sails. Fool me once, shame on you; fool me twice, shame on me.

If it works and they continue to support it, great. But forgive me for not being excited.
Been there before? Excuse me for pointing out the obvious, man. Nvidia isn't even playing in the same ballpark anymore; Nvidia is just a farm team trying to catch up while life moves on.

APUs, Xbox, PlayStation, etc., etc. We all know why you are biased toward, ahem, certain hardware. Nvidia is gonna make me pay loads more for marginal performance gains. That's a fact.

#29 · January 10, 2014, 09:19 PM · AkG (Hardware Canucks Reviewer)

You do realize that Skymtl runs mainly AMD cards in his own systems, right? Or are you so blinded by the red-colored Kool-Aid that you can't see that? AMD has good tech... it's their drivers/software that have always been the problem. This 'me too' is never going to happen for the desktop environment in any big way. At best it will help laptop performance... at worst it's just another AMD PR move, as that is all they have left... PR. Let's see a real-time demo using a real monitor and not a laptop. Until then this is all smoke and mirrors for the AMD fanbois, who are eating it up and asking for a second helping.

For the record: those 'marginal performance gains' are what is going to make 4K gaming possible within the next two years. Without G-Sync, even tri-SLI rigs don't have the horsepower. Oh, and how are those EZ-Bake-oven 290Xs working out for people? Bet they are great for keeping the home warm during this cold winter.

#30 · January 10, 2014, 10:56 PM · Hall Of Fame member (Calgary)

Smoke and mirrors? What I see are shelves full of Nvidia cards and shelves completely empty of AMD cards. I'm not gonna get into the "bias" on here, AkG, because it will just turn into a pissing match and a finger-pointing, he-said-she-said, religious hillbilly blood bath.

Fact of the matter is AMD is completely dominating Nvidia in the home, on the TV/monitor screen, not to mention their other markets... I haven't seen ANY stock of AMD cards since November, and I expect they'll continue to sell out as soon as they hit the shelves. 270s and up. Yes, it's coin mining, but it counts... sales are sales, man. When was the last time you saw the price of video cards increasing? Buy stock in AMD, not Nvidia.

G-Sync is the smoke-and-mirrors gimmick, one which requires an altogether more expensive build from top to bottom. And for what? Mainly just for games? Whether or not Nvidia makes a marginally better card is irrelevant at the end of the day. And all things being equal, I've seen my fair share of people whining about Nvidia drivers too. I'll be honest... I am not a fanboi of AMD, but I am a fanboi of my wallet.