#11 - August 11, 2009, 06:30 PM
SKYMTL (HardwareCanuck Review Editor)

Did a little searching:

Nvidia disables PhysX when ATI card is present

Use drivers prior to the 186 series, it seems.
#12 - August 11, 2009, 06:32 PM
SKYMTL (HardwareCanuck Review Editor)

Quote:
Originally Posted by ToXic
I'm thinking PhysX is a bit of a dead fish now though, isn't it?

In my opinion it works quite well, but the issue is that current CPUs are hardly a bottleneck when it comes to properly coded physics calculations.

Seeing Havok working in Killzone 2 is really awe-inspiring...
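
For what it's worth, here's a minimal sketch of the kind of per-frame particle integration step being talked about, split across however many CPU cores are available. It's invented C++, not anything from the PhysX or Havok SDKs, just to show how embarrassingly parallel this workload is when it's actually coded for multiple threads.

Code:
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Integrate one slice of the particle array for a single time step.
void integrate(std::vector<Particle>& p, std::size_t begin, std::size_t end, float dt) {
    const float g = -9.81f;                  // gravity, m/s^2
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vz += g * dt;                   // apply gravity
        p[i].x  += p[i].vx * dt;             // advance position
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
        if (p[i].z < 0.0f) {                 // crude ground bounce
            p[i].z  = 0.0f;
            p[i].vz = -0.5f * p[i].vz;
        }
    }
}

int main() {
    std::vector<Particle> particles(100000, Particle{0.f, 0.f, 10.f, 1.f, 0.f, 0.f});
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const float dt = 1.0f / 60.0f;           // one 60 fps frame

    // One worker thread per core, each owning a disjoint slice of the array.
    std::vector<std::thread> workers;
    const std::size_t chunk = particles.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end   = (c + 1 == cores) ? particles.size() : begin + chunk;
        workers.emplace_back(integrate, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();

    std::printf("stepped %zu particles on %u threads\n", particles.size(), cores);
    return 0;
}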
#13 - August 11, 2009, 06:35 PM
ToXic (MVP)

Quote:
Originally Posted by SKYMTL
In my opinion it works quite well, but the issue is that current CPUs are hardly a bottleneck when it comes to properly coded physics calculations.

Seeing Havok working in Killzone 2 is really awe-inspiring...

Killzone 2 was a beautiful game, and it made me proud to own a PS3...

But do you think my i7 will be fine doing the PhysX work on its own, or should I take the W7 plunge just for some PhysX support?
#14 - August 11, 2009, 06:37 PM
enaberif (Hall Of Fame)

Quote:
Originally Posted by SKYMTL
In my opinion it works quite well, but the issue is that current CPUs are hardly a bottleneck when it comes to properly coded physics calculations.

Seeing Havok working in Killzone 2 is really awe-inspiring...

I wouldn't necessarily say that. I remember running Mirror's Edge with and without PhysX enabled on a single GPU: with it off my CPU didn't work as hard, but turning it on taxed my GPU more.
#15 - August 11, 2009, 06:38 PM
SKYMTL (HardwareCanuck Review Editor)

IMO, no matter what happens, PhysX will not work well on a CPU because it will always be implemented with an eye towards GPUs.
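
To illustrate the point, here's a rough sketch (again invented C++, not anything from the PhysX SDK) of a GPU-minded structure-of-arrays layout. On a GPU the loop body would be one lightweight kernel thread per index; ported straight to the CPU it collapses into one serial loop on a single core unless someone does extra work to split it up.

Code:
#include <cstdio>
#include <vector>

// Structure-of-arrays: every field is a flat array, which gives a GPU
// coalesced memory access and one lightweight thread per body.
struct BodiesSoA {
    std::vector<float> x;
    std::vector<float> vx;
};

// On a GPU this loop body would be a kernel launched over thousands of threads.
// A straight CPU port just runs it as one serial loop on one core, leaving the
// other cores idle.
void step(BodiesSoA& b, float dt) {
    for (std::size_t i = 0; i < b.x.size(); ++i) {
        b.x[i] += b.vx[i] * dt;
    }
}

int main() {
    BodiesSoA b;
    b.x.assign(1 << 20, 0.0f);   // about a million bodies, all starting at x = 0
    b.vx.assign(1 << 20, 1.0f);  // all moving at 1 unit/s
    step(b, 1.0f / 60.0f);
    std::printf("x[0] after one step: %f\n", b.x[0]);
    return 0;
}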
#16 - August 11, 2009, 06:41 PM
enaberif (Hall Of Fame)

Quote:
Originally Posted by SKYMTL
IMO, no matter what happens, PhysX will not work well on a CPU because it will always be implemented with an eye towards GPUs.

Yup, I agree.
#17 - August 11, 2009, 06:41 PM
ToXic (MVP)

Quote:
Originally Posted by SKYMTL
IMO, no matter what happens, PhysX will not work well on a CPU because it will always be implemented with an eye towards GPUs.

So you're saying... I should do the W7 thing? I really do want PhysX, but I don't know how well my 750W supply will do being almost maxed out...
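
Quick back-of-the-envelope on the power supply question. The wattages below are hypothetical placeholders, not measured numbers for any particular card or rig, so check real specs before buying anything.

Code:
#include <cstdio>

int main() {
    const double psu_watts        = 750.0;  // the 750W unit mentioned above
    const double current_draw     = 550.0;  // hypothetical full-load draw of the existing rig
    const double physx_card_watts = 105.0;  // hypothetical draw of a small dedicated PhysX card

    const double after = current_draw + physx_card_watts;
    std::printf("estimated load with a PhysX card: %.0f W of %.0f W (%.0f%% of the PSU)\n",
                after, psu_watts, 100.0 * after / psu_watts);
    return 0;
}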
#18 - August 11, 2009, 06:42 PM
SKYMTL (HardwareCanuck Review Editor)

Quote:
Originally Posted by enaberif
I wouldn't necessarily say that. I remember running Mirror's Edge with and without PhysX enabled on a single GPU: with it off my CPU didn't work as hard, but turning it on taxed my GPU more.

That is part of the double-edged sword that is GPU physics acceleration: either you shunt the work to your CPU and get less processing potential, or you move it onto the GPU, which is usually already fully utilized for rendering.

When PhysX is done well it provides some really stunning visuals, but is the hit in rendering performance worth it? It forces you to choose between image quality and physics, and I personally don't want to have to make that choice.
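
Rough numbers to show that trade-off; the frame times below are made up for illustration, not benchmarks from any actual game.

Code:
#include <cstdio>

int main() {
    const double render_ms      = 16.0;  // hypothetical GPU render time per frame
    const double gpu_physics_ms = 5.0;   // hypothetical extra GPU time for effects physics
    // The same physics work on a spare CPU core is assumed to overlap with rendering.

    const double fps_render_only = 1000.0 / render_ms;
    const double fps_gpu_physics = 1000.0 / (render_ms + gpu_physics_ms); // serialized on the GPU
    const double fps_cpu_physics = 1000.0 / render_ms;                    // hidden behind rendering

    std::printf("render only:        %.1f fps\n", fps_render_only);
    std::printf("physics on the GPU: %.1f fps\n", fps_gpu_physics);
    std::printf("physics on the CPU: %.1f fps\n", fps_cpu_physics);
    return 0;
}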
#19 - August 11, 2009, 06:45 PM
SKYMTL (HardwareCanuck Review Editor)

Quote:
Originally Posted by ToXic
So you're saying... I should do the W7 thing? I really do want PhysX, but I don't know how well my 750W supply will do being almost maxed out...

PhysX + ATI won't work with the newest drivers and Win 7. Are you going to want to mess around with uninstalling / reloading drivers every time you want a PhysX game to work?

Honestly, while there are some serious AAA titles coming with PhysX, it doesn't look like it will change gameplay enough to call it a must-have feature.
#20 - August 11, 2009, 06:49 PM
bojangles (Hall Of Fame)

I really don't see the need for PhysX just yet. Heck, we're coming out with quad- and hexa-core CPUs with HT and we have yet to use the potential of those. Parallel computing is so far behind, even though that's where computing started. Lazy programmers...
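
Case in point, putting those extra cores to work isn't exotic. A tiny sketch (plain C++ with std::async, nothing vendor-specific) that splits a big array sum across however many hardware threads the CPU reports:

Code:
#include <cstdio>
#include <future>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(1 << 24, 1.0);          // ~16 million elements
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;

    // Launch one async task per core, each summing its own slice.
    std::vector<std::future<double>> jobs;
    const std::size_t chunk = data.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end   = (c + 1 == cores) ? data.size() : begin + chunk;
        jobs.push_back(std::async(std::launch::async, [&data, begin, end] {
            return std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        }));
    }

    double total = 0.0;
    for (auto& j : jobs) total += j.get();
    std::printf("sum = %.0f using %u worker tasks\n", total, cores);
    return 0;
}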