Mirror's Edge PhysX Side-by-Side Video
by Derek Wilson on December 8, 2008 9:00 AM EST
A little while back, NVIDIA brought us the news that Mirror's Edge for the PC would feature PhysX support and include some neat effects physics. Effects physics, as you may recall, is the physical simulation of things that don't impact gameplay but simply enhance the visual impact of a game. This can range from particle systems to persistent debris, enhanced destructibility, or more accurate simulation of fluids, smoke and other volumetric effects. The impact is on immersion, but it doesn't yet bring the game-changing aspects of hardware-accelerated physics to the table.
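At its core, effects physics of this kind boils down to integrating many independent particles every frame. As a rough illustration of the idea (a minimal sketch, not DICE's or NVIDIA's actual implementation), a persistent-debris update loop might look like this:

```python
import random

GRAVITY = -9.8   # m/s^2, applied on the y axis
GROUND_Y = 0.0   # floor height
DAMPING = 0.4    # fraction of velocity kept after bouncing

def spawn_debris(n, origin=(0.0, 2.0, 0.0)):
    """Create n debris particles bursting outward from an impact point."""
    return [{
        "pos": list(origin),
        "vel": [random.uniform(-2, 2), random.uniform(0, 4), random.uniform(-2, 2)],
    } for _ in range(n)]

def step(particles, dt):
    """Advance every particle one timestep; debris persists instead of vanishing."""
    for p in particles:
        p["vel"][1] += GRAVITY * dt            # apply gravity
        for i in range(3):
            p["pos"][i] += p["vel"][i] * dt    # simple Euler integration
        if p["pos"][1] < GROUND_Y:             # hit the floor: bounce, don't despawn
            p["pos"][1] = GROUND_Y
            p["vel"][1] = -p["vel"][1] * DAMPING

debris = spawn_debris(1000)
for _ in range(240):           # simulate four seconds at 60 fps
    step(debris, 1.0 / 60.0)
```

Because none of these particles affect gameplay, the simulation can be scaled up or dropped entirely depending on available hardware, which is exactly the trade-off effects physics makes.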
Until Mirror's Edge, we hadn't seen anything that looked promising in terms of adding something really compelling to a game. The previous video we posted showed some nice potential, but we still haven't gotten the opportunity to play with it ourselves and really feel the difference. We requested a side-by-side video hoping to get a better handle on what, exactly, is improved in Mirror's Edge. NVIDIA delivered.
Here's the original video of Mirror's Edge we posted.
Here is the side-by-side video showing more clearly what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off; the slowdowns are not performance-related issues. Also, the video is best viewed in full-screen mode (the button in the bottom right corner).
The effects shown can be simulated on either a CPU or an NVIDIA GPU. The advantage of the GPU is performance, and NVIDIA indicates that even an Intel Core i7 processor will have a tough time without GPU support. So these effects aren't anything we haven't seen before, but it certainly looks like there is just a lot more of them in Mirror's Edge (and not in that really bad too-many-particles/too-much-debris sort of way). The glass breaking itself honestly looks the same (or close enough) to us, but the persistent particles are where it's at. Having a little debris stick around and be affected by the character is a nice touch. The cloth, plastic and tarp effects look like the real icing on the cake, though. The complete absence of the cloth objects when physics is disabled makes an already sparse-looking world look pretty empty by comparison.
We still want to really get our hands on the game to see if it feels worth it, but from this video we can at least say that there is more positive visual impact in Mirror's Edge than in any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow-through is there.
Extending this story, NVIDIA is announcing today that EA and 2K Games have both licensed PhysX and will be working with NVIDIA to include the technology in future titles they publish. All EA and 2K development studios will now have a license to develop with PhysX for all platforms. This means Mirror's Edge may not be the only EA title going forward to get the PhysX treatment, and 2K will bring PhysX to the table with Borderlands (which is being developed by Gearbox).
81 Comments
danchen - Monday, December 8, 2008 - link
Well, this might sound stupid, but since I know nothing about PhysX, would someone explain if it is possible to have an ATI HD4870X2 in the first PCI-E slot powering the graphics (GPU), and then have an 8800GT in the second PCI-E slot working as a physics processing unit (PPU)?
DerekWilson - Monday, December 8, 2008 - link
One of the current limitations of Vista is that you can only run one display driver using the WDDM. Since the NVIDIA driver needs to be running for PhysX to work, you cannot use ATI as a main card and NVIDIA as a PhysX card. I'm sure NVIDIA would love to allow their cards to be used even in systems with ATI graphics. Hopefully Windows 7 will change that ... though I'm not going to hold my breath.
strikeback03 - Tuesday, December 9, 2008 - link
Maybe it would be possible for NVIDIA to write a driver which presents the card you want to use for PhysX as something other than a display device?
haukionkannel - Monday, December 8, 2008 - link
Yep, that is the case. DX11 changes some things so that you can use the CPU for GPU tasks and vice versa, but nothing about mixing different GPUs so far.
Creig - Monday, December 8, 2008 - link
In fact, it sounds as if DX10 and DX10.1 hardware may already be DX11 compatible.
"GAMEFEST 08: New version adds compute shaders for GPGPU; completely compatible with DirectX 10 hardware
Microsoft has revealed the first details on the latest version of its DirectX SDK at the Gamefest event in Seattle.
Chief new features in version 11 are the new Compute Shader technology, which allows GPUs to be used for general purpose computing; support for tessellation, allowing models to be refined and made smoother up close; and multi-threaded resource handling to help games utilise multi-processor set-ups more effectively.
Rather than require new hardware as DirectX 10 did, DirectX 11 will be completely compatible with DirectX 10 and 10.1 cards - but will, like its predecessor, only support Windows Vista."
ltcommanderdata - Monday, December 8, 2008 - link
I'm guessing that right now PhysX is written using CUDA. I believe it's been said that nVidia has no objection if someone were to write a CUDA interface for ATI cards. But could the same be done by third parties for PhysX? I'm guessing not, since I assume PhysX is proprietary and the source code is not available. I wonder what the chances are of nVidia porting PhysX to OpenCL so all GPUs can be supported, not that OpenCL is finalized and awaiting final ratification.
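Part of why such a port is plausible is that per-particle physics is embarrassingly parallel: each particle's update is independent, so the work maps naturally onto a CUDA or OpenCL kernel. As a rough sketch of that structure (using numpy array operations as a stand-in for GPU kernels, not actual PhysX or CUDA code):

```python
import numpy as np

def step_all(pos, vel, dt, gravity=-9.8):
    """One data-parallel timestep: every particle is updated independently,
    so each array operation below corresponds to one GPU kernel launch."""
    vel[:, 1] += gravity * dt      # apply gravity to every particle at once
    pos += vel * dt                # integrate all positions in one operation
    below = pos[:, 1] < 0.0        # mask of particles that fell under the floor
    pos[below, 1] = 0.0            # clamp them back onto the floor
    vel[below, 1] *= -0.4          # damped bounce
    return pos, vel

rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, (100_000, 3))
pos[:, 1] += 2.0                   # start everything above the floor
vel = rng.uniform(-2.0, 2.0, (100_000, 3))
for _ in range(60):                # one second at 60 fps
    step_all(pos, vel, 1.0 / 60.0)
```

With 100,000 particles, a CPU walks this work through a handful of cores, while a GPU can spread each array operation across thousands of threads, which is the performance gap NVIDIA is pointing at.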
SuperGee - Monday, December 15, 2008 - link
Intel, AMD and nVidia are all supporting OpenCL, so I wonder if PhysX will get ported to OpenCL. That would mean ATI could be supported indirectly, using PhysX without hacks.
nVidia is in a battle with Intel over CPU vs. GPGPU (the Larrabee thing).
So nVidia may see ATI (AMD) this way: my direct smaller enemy is also the enemy of a bigger enemy, so the first becomes a 'friend.'
nVidia might need AMD to pull off GPGPU physics fast, within the next 2 to 3 years before Larrabee gets popular.
For nVidia the enemy is Intel with Havok, and AMD is in the middle.
The first sign of this would be if nVidia gives support to the ATI hacker while AMD sticks with Intel's Havok.
shin0bi272 - Wednesday, December 10, 2008 - link
One of the options NVIDIA could take would be to go with a PCI-Express 4x or 8x PhysX add-in card. The old ones are PCI and produce half the number of particles that the 8800GTX produces on its own, so a faster interface would do wonders for an add-in card. Then NVIDIA could corner the market by licensing the idea out to companies (including ATI) for production, just like they do with their video cards. Then the people who want to buy ATI video cards and ATI PhysX cards can do so ... or ATI can add a PhysX chip to their cards like NVIDIA did with the 8800GTX to try to do SLI physics.
chizow - Monday, December 8, 2008 - link
They've said in the past that they would love for ATI to jump on the PhysX bandwagon, but I'm sure that means they would want royalties as well. One of the early reports was a few pennies per GPU, which was apparently too much to pay. It's clear at this point that ATI does not want to support PhysX directly, and that Havok is going to be a dead end until Intel can accelerate it beyond the capabilities of a CPU. The only hope for ATI, imo, is DX11 support of hardware physics and some kind of wrapper and licensing agreement for native PhysX titles.
ltcommanderdata - Monday, December 8, 2008 - link
I had a spelling mistake in my previous post. I meant "now" that OpenCL is finalized and awaiting final ratification.
It's surprising how quiet Intel has been with Havok. There seem to have been quite a few PhysX-related announcements recently, but no real response from Intel. And it doesn't look like anything has come out of AMD's partnership with Havok either.