NVIDIA APEX PhysX: CPU vs GPU Efficiency
Articles - Featured Guides
Written by Olin Coles   
Friday, 17 September 2010
Table of Contents
NVIDIA APEX PhysX: CPU vs GPU Efficiency
NVIDIA APEX PhysX Enhancements
Testing and Initial Results
APEX PhysX: GPU vs CPU Tests

NVIDIA APEX PhysX Efficiency: CPU vs GPU

Benchmark Reviews tests NVIDIA APEX PhysX efficiency in Mafia II, comparing CPU and GPU performance.

According to the August 2010 Steam hardware survey, PC gamers are using NVIDIA GeForce desktop video cards nearly 80% more than their AMD/ATI counterparts. Great products have come from both the GeForce and Radeon brands, yet based on this survey NVIDIA owns almost 60% of the entire graphics market compared to AMD's 33%. Gamers might rely on NVIDIA's hardware for its superior graphics processing power and affordable price point, but it is the company's gaming technologies that have helped deliver complete market dominance (among Steam users). NVIDIA's "The Way It's Meant to be Played" is a trademarked slogan denoting direct involvement in software development as much as in hardware. When NVIDIA purchased the Ageia PhysX software physics technology back in early 2008, that commitment sharpened the company's growing double-edged sword. Adding 3D Vision only helped consummate their efforts.
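For readers checking the arithmetic behind the "nearly 80% more" figure, the two market shares quoted from the survey work out as follows (a quick illustrative check, using only the percentages stated above):

```python
# Market-share figures as quoted in this article from the
# August 2010 Steam hardware survey (illustrative check only).
nvidia_share = 0.60   # NVIDIA: almost 60% of surveyed systems
amd_share = 0.33      # AMD/ATI: about 33% of surveyed systems

# Relative difference: how much larger NVIDIA's share is than AMD's.
relative_lead = (nvidia_share - amd_share) / amd_share

print(f"NVIDIA's share is {relative_lead:.0%} larger than AMD's")
# works out to roughly 80% more, matching the article's figure
```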

In this article, Benchmark Reviews will demonstrate how far PhysX technology has come using the recently released Mafia II video game by 2K Games. In this single-player third-person action shooter developed by 2K Czech for 2K Games, players assume the life of World War II veteran Vito Scaletta, the son of a small Sicilian family that immigrates to Empire Bay. Mafia II makes use of DirectX 11 extensions on 2K Czech's proprietary Illusion game engine, which introduces NVIDIA APEX PhysX and GeForce 3D Vision technology enhancements. NVIDIA's APEX PhysX modeling engine adds new Destruction, Clothing, Vegetation, and Turbulence physics to games such as Mafia II. While adding PhysX support to a video game is nothing new for NVIDIA, allowing APEX PhysX features to be computed by the computer's central processor is new territory. For this NVIDIA APEX PhysX: CPU vs GPU Efficiency demonstration, our tests compare GeForce and Radeon GPUs against the Intel Core i7 CPU.

NVIDIA-APEX-PhysX-Efficiency-CPU-vs-GPU_Splash.png

This article isn't intended to become an NVIDIA vs AMD topic, but that comparison becomes impossible to avoid since ATI does not license PhysX. NVIDIA offers a free software development kit so CUDA drivers can be built for AMD products, yet all ATI Radeon graphics cards (up to the HD 5000 series) still do not compute PhysX commands without modified drivers. As a result, PhysX hardware acceleration is presently available only on GeForce GPUs, unless gamers research unsupported options for their Radeon products. NVIDIA has opened its PhysX platform to AMD and Intel processors in Mafia II, allowing hardware acceleration to be calculated by the system's central processor. The focus of this article is how well PhysX is processed by the CPU and GPU, and where the different GeForce Fermi graphics processors (GF100, GF104, GF106) stack up in terms of PhysX efficiency.

Mafia2_3d-Vision_Debris.jpg

NVIDIA APEX PhysX Destruction, Clothing, and Particles in Mafia II

Mafia II is the sequel to Mafia: The City of Lost Heaven, released in 2002. Growing up in the slums of Empire Bay teaches Vito about crime, and he's forced to join the Army in lieu of jail time. After sustaining wounds in the war, Vito returns home and quickly finds trouble as he again partners with his childhood friend and accomplice Joe Barbaro. Vito and Joe combine their passion for fame and riches to take on the city, and work their way to the top in Mafia II. While this premise makes for an interesting storyline, it's the graphical effects that keep players immersed in very realistic settings. NVIDIA's APEX PhysX is the glue that binds Mafia II together, giving the game life-like physics that create a virtual reality.

Mafia II was developed with NVIDIA's PhysX SDK version 2.8.3, the software development kit available at the time. This version supports only single-core, single-threaded PhysX CPU processing, which leaves most of the hardware in a modern PC idle. The current PhysX SDK (version 2.8.4) supports SSE2 instructions, but this feature must be enabled by the developer. According to NVIDIA, the forthcoming PhysX SDK 3.0 will introduce multi-threaded CPU support for PhysX extensions, with SSE enabled by default.
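To see why single-threaded CPU PhysX matters so much here, consider a toy frame-time model. The millisecond costs below are invented purely for illustration (they are not measurements from Mafia II); the point is that a physics step pinned to one core adds its full cost to every frame no matter how many cores the CPU has, while a multi-threaded SDK could spread that cost out:

```python
# Toy model (not NVIDIA code): frame time = render phase + physics phase,
# where the physics phase only scales across cores if the physics SDK
# is multi-threaded (as PhysX SDK 3.0 promises to be).
def frame_time_ms(render_ms, physics_ms, cores, multithreaded):
    """Return total frame time in milliseconds for one frame."""
    effective_physics = physics_ms / cores if multithreaded else physics_ms
    return render_ms + effective_physics

# Hypothetical per-frame costs, chosen only to illustrate the scaling.
render, physics = 10.0, 30.0   # milliseconds

old_sdk = frame_time_ms(render, physics, cores=4, multithreaded=False)
new_sdk = frame_time_ms(render, physics, cores=4, multithreaded=True)

print(f"PhysX 2.8.x-style (1 thread):  {1000 / old_sdk:.0f} FPS")
print(f"PhysX 3.0-style (4 threads):   {1000 / new_sdk:.0f} FPS")
```

Under these made-up numbers the single-threaded path caps the game at 25 FPS on a quad-core, while the same work split across four cores would allow roughly 57 FPS; a faster clock helps the single-threaded case only linearly, which is why the CPU results later in this article barely move between CPU models.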



 

Comments 

 
# Haha - smodtactical 2010-09-16 16:55
LOL then why all the slobbering and bias towards nVidia? Also what is the logic in testing ATI cards UNDER physX when they don't do physx? The frame drop is utterly irrelevant because an ATI user would never turn physX on because there would be no actual physX occurring despite the software being installed.

# RE: Haha - Olin Coles 2010-09-16 17:40
The majority of your comments have been unpublished because you have demonstrated an inability to respectfully discuss this topic.

I might also add that you're clearly picking a fight without knowing the subject matter. Those are factual numbers I'm quoting from the Steam Hardware Survey. Look it up for yourself. Also, if you don't think ATI users are playing Mafia II with PhysX enabled perhaps you should read the comments in our review of that game: benchmarkreviews.com/index.php?option=com_content&task=view&id=582&Itemid=64

# RE: Haha - smodtactical 2010-09-16 17:41
PhysX is an open standard, DESIGNED BY ATI's COMPETITORS. Yes, smart strategy, let's implement technology from the opposing company. You should be the new head ATI strategist.

I apologize for missing the CPU-driven PhysX statement. You are totally right there, and although PhysX is good, it's hardly revolutionary or even practical FPS-wise.

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - David Ramsey 2010-09-16 17:03
The only "slobbering and bias" here is what you're imagining. PhysX is an open standard ATI has chosen for whatever reason not to implement. This article shows how it can enhance a game, especially when it's designed in from the first.

And there IS "actual PhysX" occurring, even with an ATI card, since the game allows you to use the CPU to run PhysX. You did read the article, right? Or are you just a frustrated ATI fanboi?

# smodtactical - MikeS 2010-09-16 17:35
According to the August 2010 Steam hardware survey, PC gamers prefer NVIDIA GeForce desktop video cards nearly 80% more than AMD/ATI counterparts.

Try slowly reading the first six words of the above sentence.

A survey by "Steam" not Benchmarks.

Learn to read properly then you can have intelligent conversations.

# RE: smodtactical - Stan 2010-09-16 21:59
According to the August 2010 Steam hardware survey, PC gamers prefer NVIDIA GeForce desktop video cards nearly 80% more than AMD/ATI counterparts.

The above statement is an "utter statistics", according to Mark Twain classification.

# Ya - smodtactical 2010-09-16 17:43
That is my whole point. I know you are referring to the steam study but to claim the nVidia video cards have such market dominance is the fanboyish part of the statement (thats why I quoted that). A more balanced statement would have been 'perhaps the reason Nvidia is favored in the sphere of Steam users or in the Steam market' etc.

The steam market does not = the entire market nor can such an extrapolation be made.

# Get over it. - Olin Coles 2010-09-16 17:47
I notice how you ignore the second sentence in this intro: "Great products have come from both GeForce and Radeon brands". If you've got another anonymous hardware poll that proves otherwise, feel free to suggest it. In the meantime, those are the facts and you'll have to find a way to live with them.

# Hrm - smodtactical 2010-09-16 17:54
If you read my post, I am not disputing the fact, only the way it was phrased. Please re-read my post.

Again, you're almost making an extrapolation, saying that because Steam says it, that's how the entire market is, and then plugging your ears saying 'lalalala can't hear you'.

# RE: Hrm - Olin Coles 2010-09-16 17:58
"According to the August 2010 Steam hardware survey, PC gamers prefer NVIDIA GeForce desktop video cards nearly 80% more than AMD/ATI counterparts. Great products have come from both GeForce and Radeon brands, yet based on this survey NVIDIA owns almost 60% of the entire graphics market compared to AMD's 33%."

Yep, it's totally biased... towards Steam. I'm waiting on you to suggest another survey. Reply when you have one.

# Steam survey. - Olle P 2010-09-17 01:25
The only objection I have to how this is presented is the wording that users "prefer" nVidia cards.
Steam doesn't ask what they prefer, but just does an automated check on what's actually being used.
Therefore it would be much more correct to state that gamers (currently) "use" more nVidia than ATI cards.

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - mihai 2010-09-16 17:46
I have an Ageia PhysX card which cost about 15 dollars, plus a 1055T CPU and a 5870 GPU, and the game runs very well. Before you buy, just think about a solution... you can also buy a second-hand NVIDIA 8800GT almost for free and use it for PhysX.

# Mihai - smodtactical 2010-09-16 17:50
Well said Mihai, I mean a second card is an option if your PSU and mobo can handle it.

In regards to this post:

# RE: Haha - Olin Coles - 2010-09-16 17:40
Look it up for yourself. Also, if you don't think ATI users are playing Mafia II with PhysX enabled perhaps you should read the comments in our review of that game: benchmarkreviews.com/index.php?option=com_content&task=view&id=582&Itemid=64

===================

Olin I already apologized about not reading the part of cpu physx. It was my mistake, sorry.

Although I do find it intriguing that your other users share my sentiment about the nVidia sponsorship nature of your article writing. :)

# RE: Mihai - Olin Coles 2010-09-16 17:55
What could they have possibly sponsored for this article? I have already published full reviews of each individual product long ago, and this article is only meant to supplement the Mafia II review. Several of the comments in that review stated that Radeon video cards play this game with PhysX just fine, yet the benchmark results prove otherwise.

If you believe that quoting the results of Steam's Hardware Survey indicate some level of bias, you're clearly fishing for something that doesn't exist. Point me to another large-scale survey, and I'll be happy to add that perspective.

# Picard facepalm - smodtactical 2010-09-16 18:01
You are not listening to me. I am not disputing the results of the steam survey and your quoting them in the article. What I am disputing is your almost virtual extrapolation of the steam market equaling the entire pc gaming market which obviously cannot be made. Instead of saying 'market dominance' you might want to say 'dominance among steam users or in the steam market'. That would be the more balanced statement.

I won't say this again, please read it as many times as it takes you to understand.

Thank you.

# RE: Picard facepalm - Olin Coles 2010-09-16 18:11
Beginning with "According to the August 2010 Steam hardware survey" should have set the tone, and including "yet based on this survey" should have punctuated it. But alas, it seems the intro needs to point out a third time that these are Steam statistics for the message to stick: "Gamers might rely on NVIDIA's hardware for its superior graphical processing power and affordable price point, but it's their gaming technologies that have helped deliver complete market dominance (among Steam users)."

Satisfied now?

# Thanks - smodtactical 2010-09-16 18:14
Sounds better. But remember, it's not just me that commented on how the writing in general on the site seems biased towards nVidia.

Just stay neutral and objective and you will be fine. :)

# Bias is all yours. - MikeS 2010-09-16 18:38
You're the only one who has made a biased comment about this article.
You're neither neutral nor objective yourself.

You're mad at the Benchmark editors?
Get over it.

This is one of the best if not the best electronics review sites on the web.
You don't agree? That is fine.

Why come to a site you don't like?

Have a nice life.
Mike

# RE: Thanks - David Ramsey 2010-09-16 18:43
This time last year, ATI ruled the graphics roost, and our review of the Radeon HD 5870 said "AMD has retaken the crown for superior graphical power with their ATI Radeon HD 5870 video card, and consumers have confirmed that this is the hottest graphics accelerator of the moment." and "While NVIDIA toils away with CUDA and PhysX, ATI is busy delivering the next generation of hardware for the gaming community to enjoy."

We slammed NVIDIA for saying that DirectX 11 wouldn't drive game development, and for crippling anti-aliasing for non-NVIDIA cards in games like "Batman: Arkham Asylum".

That was then, and this is now. NVIDIA now has a significant advantage over ATI, especially with mid-range cards like the 460. Will this all change with the introduction of the rumored ATI 6800 series? Who knows? But we'll test it, and if it beats NVIDIA's offerings, someone else with an axe will be back accusing us of being in ATI's pocket.

# well - smodtactical 2010-09-16 18:53
Even that sounds fanboyish towards ATI. Why not be more balanced in your phrasing... honestly take a look at other sites for what I mean (like anand or tom).

They were kind of right about dx-11 so far. The reason? DX doesn't drive development... consoles do. For the most part as consoles advance so do pc ports and games advance (which is not much of course).

When the next-gen consoles arrive tech like DX-11 if its shared by them will be all the rage. Before that not so much.

# LOL - Steven Iglesias-Hearst 2010-09-17 01:33
Don't bring consoles into the argument; they are too out of date for our liking.

The Xbox 360 was released in 2005 and the PS3 a year later. Around that time NVIDIA was making the 6800 Ultra NV45 core (360) and 7800 GTX G70 core (PS3), and ATI the Radeon X850 XT PE R481 core (360) and Radeon X1800 XT R520 core, both of which use DirectX 9.0.

The PS3 uses an NVIDIA 7800 core for its GPU and the 360 uses an ATI Xenos (R500 core) for its GPU. That's why games always look better on a PC, which is where PhysX comes in.

# OF Course - James Smith 2010-09-16 18:34
...And of course, a game with this kind of PhysX is going to favor NVIDIA. NVIDIA "gimps" PhysX on the CPU, probably to hurt ATI. See this and other articles: realworldtech.com/page.cfm?ArticleID=RWT070510142143&p=5

# WOW - smodtactical 2010-09-16 18:56
Amazing article James! Hey benchmarkreview guys! Take a look at this, pretty thought provoking!

# Says it all - jnanster 2010-09-20 08:14
This article kinda says it all: PhysX is written clunky (old code, not multi-threaded) and will probably stay that way because Nvidia wants it that way. I personally have a dedicated PhysX card which works well.

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Federico La Morgia 2010-09-16 22:02
920 @ 4 GHz or 980 @ 4.5 GHz and repeat benchmark cpu physx. ;)

# RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Olin Coles 2010-09-16 22:05
Really Federico? What do you imagine that will change in the CPU PhysX results? 1-FPS at most, is my estimate.

# RE: RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Federico La Morgia 2010-09-17 00:39
boh,
Google translate : I hope that something will change (2.66 ----> 4 GHz)

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Zogrim 2010-09-17 01:45
Theme is interesting, but article is way too superficial:

1) No dedicated PhysX GPU was used -> APEX Clothing module is running on CPU in this case
2) Why no CPU scaling (cores/clocks) graphs ? Would be interesting to see if CPU APEX is multi-threaded at all.
3) Same with APEX tweaks, would be nice to see what effects should be removed, to make Mafia II playable on Radeon with APEX On.

# physxinfo.com - Olin Coles 2010-09-17 07:50
Very interesting website, Zogrim! I will agree that my article only scratches at the surface, but that's exactly what it's intended to do. I've learned (the hard way) that visitors don't absorb very much if you give them information overload... evidenced by the fact that our first comment proved the visitor hadn't even read past the intro. A website like yours is specifically designed to dive into the topic, and I urge our readers to learn more in your specialized articles. The good news is that I can always write add-on articles that dive deeper into the subject... something I plan to do as time allows.

# Heh - smodtactical 2010-09-17 08:29
I did read past the intro but your quality of writing made it hard to do so. A sentiment evidenced by all the negative comments that came after mine. :D

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - anthony 2010-09-17 03:39
I really enjoyed this review, and really don't see a bias towards either manufacturer by the author. Back when PhysX was first introduced to the gaming community you had to purchase a card made by Ageia, and when NVIDIA purchased the technology they introduced it into their software. I have tried numerous cards over the years in my system from both manufacturers; it just depends on the offering of technology that pushes me to what I buy. If you solely purchase a card due to its brand name you truly are doing yourself a disservice.

# Physx DOES NOT run properly on cpu - sweatshopking 2010-09-17 04:21
dudes. google physx x87. it runs single core with no sse. physx on cpu is a joke. 4 cores on the cpu? what for? it only USES ONE. and with a dead language. Physx blows. I have nvidia gpu's, and I'd way rather have dx11, and according to steam survey's over 80% of the dx11 are amd

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - mihai 2010-09-17 05:57
this is the purpose of this article....to create storm and people to react,,,but at the end of discussion the mighty radeon 6000 series is in the market
and yes this is a fanboy statement :)

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Ahmad Saleem 2010-09-17 06:41
Mafia 2 was designed using the NVIDIA PhysX 2.8.3 SDK, while the new 2.8.4 SDK adds support for SSE2 instructions, which means new games have better PhysX, and the new 3.0 SDK gives multi-threaded support to PhysX. That's what I know; maybe I am wrong. Whatever the position of ATI, if it is that almighty it should release a PhysX counterattack. The Intel Havok physics engine needs a 6-core CPU, now imagine...

# Added into article: - Olin Coles 2010-09-17 08:17
Thank you for this lead, Ahmad. I researched the SDKs and SSE, and there were several keen articles on the topic. I have added this text to the article based on your comment: "2K Games designed Mafia II using NVIDIA's PhysX 2.8.3 SDK, which supports only single-threaded PhysX CPU processing. PhysX SDK version 2.8.4 supports SSE2 instructions (which are not enabled by default for backwards compatibility), allowing updated games to compute PhysX more efficiently if developers enable the function. Finally, the forthcoming PhysX SDK 3.0 is said by NVIDIA to introduce multi-threaded CPU support to PhysX with SSE enabled by default, which could really change the game for everyone."

# RE: Added into article: - Andrew H 2010-09-18 16:57
Right, for 'backwards compatibility' purposes... That's NVIDIA for you, trying to pass the buck to the game developer, when in the end they're just making it as difficult as possible to not use an NVIDIA card and get the full experience. They could do it themselves. "SSE not enabled by default" is a joke. Every CPU has been SSE compatible since the Pentium 2. Any PC you could build with a CPU that wasn't SSE compatible could never run any game that uses PhysX, period. If that's true, then why haven't any game developers, including the ones that released the absolutely latest games (Mafia II, etc) enabled it themselves? So their games run worse without NVIDIA cards because they really like NVIDIA and want to get them more money? Right.

# RE: Added into article: - Andrew H 2010-09-18 16:58
I guarantee you these game developers aren't using x87 to code their games. Why would they just to implement PhysX?

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - mihai 2010-09-17 06:48
i like people well informed Ahmad Saleem :)

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - mihai 2010-09-17 07:08
Shame on game developers who concentrate on making a game run better with "specific" hardware, and not on making a better storyline, etc.

NVIDIA forgets this: they are a GPU manufacturing company in the first place, not a software bribe company. For me it's simple: now they are a weak company and they play dirty. Shame again...
PhysX should be free software because physics belongs to nature :) Ageia and NVIDIA didn't invent anything

# RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - David Ramsey 2010-09-17 09:29
Ageia and NVIDIA "didn't invent anything"? It's easy to say the results of other people's work should be free, isn't it? Do you work for free? If not, why is what you do worth anything?

# Keep discussions on-topic - Olin Coles 2010-09-17 10:56
NVIDIA pays employees to engineer and develop PhysX, which is why there's a commercial licensing fee. It's free for non-commercial use, which means you're not using their software to make money for yourself. I think this is entirely fair.

Please keep comments and discussion on-topic with this article.

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - anthony 2010-09-17 08:46
Just a thought, but how can anyone write a review like this without leaning towards the manufacturer that owns the technology?

Yes, they are opening it up to other manufacturers' chipsets, and they really don't have to. But what I think the author is trying to express is that it is a long way from being ready for use with CPUs. To get the full benefit of any technology you are stuck with what the developer has built it around.

# Why such an old CPU? - Lee 2010-09-17 09:03
The test setup used the latest gfx cards from NV and AMD, but why the 2.66 GHz Nehalem CPU? As shown in the last chart, FPS is significantly lower when PhysX runs on the CPU vs. on the NV card. Why not try this on the latest CPU and see if it's truly a hardware limitation, or if there's additional optimization that could be done to make PhysX run better on CPU?

If it's a matter of getting the hardware please let me know. -Lee

# RE: Why such an old CPU? - Olin Coles 2010-09-17 09:06
In your estimation, what is 'the latest' CPU? I hope that you realize the only difference between a Core i7-920 and the i7-930 is 133 MHz.

# RE: RE: Why such an old CPU? - Lee 2010-09-17 09:41
There are 8 CPU SKUs better than the one used. The top end Core i7-980x has 6 cores, 3.33 GHz (3.6 with Turbo) and 12MB cache, vs. the 4c/2.66GHz/8MB from the test. It just seems like top end GPUs should be tested with top end CPUs.

# RE: RE: RE: Why such an old CPU? - Olin Coles 2010-09-17 10:51
Those are faster CPUs, but not necessarily better. Since this version of PhysX is single-threaded, it doesn't make much difference. Once PhysX 3.0 is used in games, which enables SSE by default, it will begin to matter... presuming you don't use GeForce GPUs to handle PhysX.

# RE: RE: RE: RE: Why such an old CPU? - Lee 2010-09-17 11:52
They are definitely better CPUs. The problem is you are giving Nvidia a pass on the fact that PhysX is single-threaded, even though the August Steam data shows 85% of users have 2 or more cores. With the quotes below it sounds like you blame the hardware for the reduced frame rates, but the fact is it's the lack of effort to optimize PhysX for CPU.

Nvidia doesn't sell CPUs so I understand this from their position. However I would hope to see pressure from objective reviewers, as well as game developers, to fix this.

"when it comes to video games NVIDIA has proven that the GPU trumps all"
"Our Intel Core i7-920 quad-core CPU just doesn't compare to the hundreds of cores available in a graphics processor"

# RE: RE: RE: RE: RE: Why such an old CPU? - Olin Coles 2010-09-17 12:20
So how exactly are these newer/faster CPUs of the same architecture better for PhysX 2.8.3? Really, I'd like for you to please explain, since version 2.8.3 will only process on one core.

Also, I'm not giving anyone a 'pass', as I've already stated in this article (and these comments) that Mafia II APEX PhysX was the focus, not NVIDIA PhysX vs AMD vs Intel. I also mention how PhysX 3.0 will include multi-threaded processing and SSE instructions by default.

When PhysX 3.0 is used, your argument about CPU speed will matter much more than it does now.

# RE: RE: RE: RE: RE: RE: Why such an old CPU? - Lee 2010-09-17 13:10
Then I guess I'm waiting for PhysX 3.0. In the meantime developers can use a thoroughly multi-threaded physics solution like Havok to achieve all of these effects and more, without requiring an Nvidia GPU.

The quotes I mentioned are still not accurate as they're comments about the GPU vs. the CPU, while the real reason for the difference in frame rates is due to software, not hardware.

# Optimization - Setsunayaki 2010-09-17 12:55
I have both card sets and I tried the game on both. It's true that Mafia II is optimized for NVIDIA cards. This isn't the first time a game has been released where max settings are optimized somewhere...

The cute thing in reviews is that ATI and Nvidia go neck and neck in Framerates with a 480 GTX pulling ahead in games, but once you go into GPGPU and game programming...You see a different story.....An HD 5870 loses vs a 285 GTX...

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Robert17 2010-09-17 16:04
All bickering aside, thanks to all for a lively discussion. I had no idea how much talent has been invested in the gaming circles to show such realistic eye candy. (I'm not much of a gamer any more.) To the question: will such dynamic visual effects benefit all games or other applications, such as web video downloads such as YouTube, or even cross-over to enhance performance on somewhat arcane products as massive Excel spreadsheets? Any bleedover benefits?

# RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - dlb 2010-09-18 20:42
I'm not sure about this, I have no idea if it's a PhysX thing or not, but the newer web browsers (like IE9 for example), will support extra eye-candy. Of course, this will depend on the web site you're visiting and whether it is coded to deliver this 'optical sugary goop', but from what I've read (not much LOL), graphical enhancement is HUGE thing being worked on with the newer browsers. Again- I don't know if PhysX has anything to do with this (at least not >yet< )....

(I'm trying to avoid more "lively" ATI vs NV "discussion" LOL)

# 32fps average on ATi 5870 - KaptainKhaos 2010-09-17 16:24
Just add an NVIDIA GPU to do the PhysX; my old 8800GTS does this well, showing that the 5870 can match the GTX480 in this game. CPU PhysX has been deliberately crippled to show NVIDIA GPUs in a better light, but nothing stops you (even with NVIDIA's attempts at the driver level) from getting this going with an ATI card as the primary GPU.

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Ahmad Saleem 2010-09-18 01:43
People are talking like game developers should use the Havok engine; look at it, it is also owned by Intel, and it is also not free, so why are you pointing your fingers at NVIDIA? Read the IDF 2010 details, in which they mentioned that to run the Havok engine nicely one needs 6 cores. Isn't that favouring the CPU, just like people accuse NVIDIA? ATI announced they are thinking of launching an open-source OpenCL-based physics solution; where is it? They said that earlier this year; have you heard any other news since?

# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Andrew H 2010-09-18 08:03
PhysX is just a way to sucker game developers to push Nvidia by offering them money to use their physics engine instead of developing their own. The PhysX drivers are intentionally crippled so that they not only disable PhysX support when an ATI card is present, but the code to run PhysX on a CPU rather than GPU is intentionally crippled:

thinq.co.uk/2010/7/7/cpu-physx-deliberately-crippled/

arstechnica.com/gaming/news/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel.ars

There is actually no reason that a GPU would run physics code better than a CPU other than intentionally crippling the CPU implementation, especially in the age of multi-core processing. One could argue that a CPU is actually more adept for these kinds of calculations, whereas a GPU is more tuned for rendering the graphics.

# RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Olin Coles 2010-09-18 10:07
PhysX cannot be supported unless drivers are coded with that functionality. AMD licensed Havok from Intel, just like they could license PhysX from NVIDIA, but they refuse to code their Catalyst drivers to make use of a technology from a direct competitor. That's not anyone crippling functionality; that's AMD making a business decision. It's pretty easy to blame NVIDIA for something when your favorite team doesn't have the wit to play ball.

# RE: RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Lee 2010-09-18 12:08
AMD didn't need to license anything from Havok. Havok supports the HW their customers need/request on their own, including multi-threaded CPUs from Intel and AMD. NV could make PhysX run well on CPU, so far they have chosen not to.

# RE: RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Andrew H 2010-09-18 15:51
Yeah... Check the articles I posted. They have nothing to do with ATI, and everything to do with the CPU instructions used for doing PhysX on the CPU. They use an extremely outdated and inefficient instruction set to do it, and there's no reason possible other than they want it that way.

And yes, NVidia did in fact block out PhysX when an ATI GPU was present. There was a workaround where you could run an ATI GPU as your main board, and some cheap geforce just to do PhysX, but NVidia blocked that in a driver update. They even made a public statement regarding it.

# RE: RE: RE: NVIDIA APEX PhysX: CPU vs GPU Efficiency - Andrew H 2010-09-18 16:38
Also, saying that a company "doesn't have the wit to play ball" because they won't shell out money to a competitor to use their competing technology is just plain silly. The PhysX driver could support all hardware on its own; it's NVidia's choice to have it not. The PhysX SDK's EULA states the only licensed platforms are: Macs and PCs using NVIDIA cards to do PhysX, Macs and PCs running PhysX on the CPU only, PS3, 360, and Wii. Those last two are important because they show that NVidia has already written the code to have PhysX run on an ATI chipset (both the 360 and Wii have ATI chips), and purposefully doesn't release it on PC to gain more business in the GPU market. It would take little effort to take it from the 360 to the PC.
 
 
# RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyAhmad Saleem 2010-09-18 09:42
@ Andrew H: then you are suggesting we should use the Havok engine, which eats six cores and does not give graphics as good as NVIDIA PhysX...
 
 
# RE: RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyLee 2010-09-18 12:11
@Ahmad, Havok uses as much or as little hardware as it is given. If there are 6 cores, fantastic. 1 or 2 cores, that's fine too.

I'd like to understand what you mean when you say PhysX supports better graphics?
 
 
# RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyAndrew H 2010-09-18 16:17
Um.. no? I wasn't aware PhysX and Havok were developers' only options for implementing physics code... Oh wait, they're not... Besides, I think you're a little misguided regarding Havok's operation.
 
 
# Just a simple question...dlb 2010-09-18 20:32
On the last page of the review, I see in the 'CPU PhysX' section of the graph, that the GTS450 delivers higher FPS than all the other GPUs tested. To be honest, I didn't read all the text below the graph very carefully, but how did the GTS450 'beat' the GTX480 (and all others listed) in this test by roughly 15-20%?

(the "NVidia own PhysX" and "ATI is being blackballed" crapola was inevitable LOL)
 
 
# RE: Just a simple question...Olin Coles 2010-09-18 20:38
Your guess is as good as mine. I pointed that out, and I find it interesting that CPU performance is the inverse of GPU performance with these video cards. It's almost as if there's more restraint coming from the higher end. Very odd, and I wish people would talk about it more.
 
 
# High time to prosecute NVIDIAStan 2010-09-19 08:53
Well, NVIDIA has every right to implement PhysX the way it likes.

But not to subsidize (or racket) articles like this one, as this is a 100% swindle - comparing a top-notch GPU implementation with a last-century CPU version.

"Unfair competition" is still a criminal offense, and this article is a bright example of it.
 
 
# 100% swindle?Olin Coles 2010-09-19 09:22
How can my article be construed as a 100% swindle? I test the products and reveal the results. If there's a complaint, it shouldn't be against me or my article.
 
 
# The question always hold the resultStan 2010-09-19 10:41
Hi, Olin.

If your article revealed that NVIDIA is not supporting modern CPUs and is forcing end-users to use NVIDIA products only, then it would be a correct one. Otherwise it is (or at least looks like) a sponsored article, highly incorrect in its conclusions, continuing NVIDIA's swindles - the demand to pay for SLI motherboards, blocking NVIDIA cards from working as a secondary card for PhysX in an AMD/NVIDIA setup with the Hydra chip, the demand for an x16 slot because they cannot make the internal interface work even at PCIe v1.1, etc.
 
 
# RE: The question always hold the resultOlin Coles 2010-09-19 10:46
Let me get this straight: NVIDIA is forcing people to buy their products? All this time I thought that it was a free market, and that AMD's products were doing just fine. Nobody is claiming that you MUST use PhysX; in fact, I've often said the benefits outweigh the costs. However, when it comes to Mafia II, I couldn't imagine playing this game without those special effects. This was an article about CPU vs GPU PhysX in Mafia II, not an AMD vs NVIDIA article. It's unfortunate that I must continue to point this out.
 
 
# RE: RE: The question always hold the resultAndrew H 2010-09-19 11:07
I don't buy that you were paid by NVIDIA to write this. Even if you were, you couldn't admit it and there'd be no way to prove it, so it's pointless to even talk about. But I'm sorry, this wasn't an article about Mafia II. The beginning of the article even says "In this article, Benchmark Reviews will demonstrate how far PhysX technology has come using the recently-released Mafia-II," and the headline reads "NVIDIA APEX PhysX Efficiency: CPU vs GPU". This was about PhysX, not Mafia. That said, it lacks some important frame of reference when comparing its performance on NVIDIA GPUs vs CPUs. It's important to consider the roots of PhysX, discuss its necessity, and consider the differences in codebase as well as NVIDIA's motives behind those differences when doing this comparison. Not doing so definitely leaves you open to the accusation of being paid by NV (not that I think you are). Also because of all this, you can't consider ATI's position and approach off topic. It's integral.
 
 
# RE: RE: RE: The question always hold the resultOlin Coles 2010-09-19 11:15
Wow, thank you for confirming the sheer skepticism of which this audience is capable. Additionally, thank you for pointing out that I was incorrect in my estimation of what I was writing about in my own article. Thank you so much.

Unless I'm wrong (again), APEX PhysX is new to PC video games and Mafia II was the first time it's been used. You might know more about all of this than I do, so please tell me how PhysX and APEX PhysX are 100% the same thing, and how a statement like "how far PhysX technology has come using the recently-released Mafia-II" doesn't apply to the introduction of APEX PhysX.

Seriously though, if this wasn't about APEX PhysX in Mafia II, then why did I spend four pages using the game as my central discussion? The only people who think that this article was about NVIDIA vs AMD are those readers who really want it to be. That being the case, it's a lot easier just to ask me for an article that focuses on that topic.
 
 
# RE: RE: RE: RE: The question always hold the resultStan 2010-09-19 11:30
Would agree - it is not NVIDIA vs. AMD - it's a continuation of NVIDIA vs. Intel

:)
 
 
# RE: RE: RE: RE: The question always hold the resultAndrew H 2010-09-19 11:49
Whoa, calm down buddy. I was just pointing out that saying the article is about Mafia II and saying it's about how far PhysX has come have completely different implications, and that not addressing some things in the article left you open to the 'paid by NVIDIA' accusation...

As far as APEX PhysX vs. PhysX, correct me if I'm wrong, but APEX is just a set of tools to allow graphic artists access to PhysX capabilities without having to know how to do low-level code work (vs the low-level API in use before), and has zero to do with its performance on a GPU vs CPU, since it's still the standard PhysX code base (albeit a newer version than previously used) telling the hardware what to do. APEX is simply about PhysX being easier for game makers to adopt, as it saves them money hiring developers.
 
 
# RE: RE: The question always hold the resultStan 2010-09-19 11:16
I would agree, if I could add a secondary (cheap) NVIDIA card to enable PhysX effects alongside an AMD card (PhysX needs only a small portion of even an entry-level card), but NVIDIA intentionally blocked that use of their card - if this is not an unfair-competition swindle, then what is?

Yes, they left some CPU possibilities - but again, intentionally very out of date - one more example of an unfair-competition swindle, this time with CPU solutions.

Your article title is "NVIDIA APEX PhysX Efficiency: CPU vs GPU" - not "Mafia II is much better on NVIDIA". The subtitle also says that Mafia II is only an example - and I'd agree: an example of an NVIDIA pre-paid demand not to use other technologies.
 
 
# RE: RE: RE: The question always hold the resultAdam 2010-09-19 12:09
You can use a second Nvidia card paired up with an ATI so you can use PhysX, there are custom drivers out and about which will allow you to do it.

On another note, there's a ridiculous number of ATI fanboys replying to this fairly innocent article; grow up, chaps. They're two companies who both do a decent job of supplying graphical technologies to us; stop complaining about the one which you did not buy into.
 
 
# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiencymihai 2010-09-19 12:53
Adam pls....

nvidia is not an honest company.... why? it's very simple:
nvidia is censoring a lot of things.... sincerely, I don't want to be an nvidia partner, because in hard times it wants only itself to survive

I think nvidia doesn't have a scalable GPU architecture in terms of efficiency, and they are trying to stay afloat with pieces of wood from their last ships, aka PhysX, CUDA, or other paid software

I say again: to remain competitive you have to discover new stuff, not a new scheme

there's a phrase in my country: "same lady, different clothes"
 
 
# RE: RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyPSM 2010-09-19 17:13
This article has already been blown out of proportion, but given the way it was presented I'm not surprised. And I don't think it's only fanboys that fear a not-so-far future where you have to buy a particular GPU just for the sake of optimization, even if it has comparatively lower throughput. I'm not saying that NVIDIA should do charity with PhysX and make it open source, but not optimizing it for the latest hardware really doesn't push things forward. While PhysX in Mafia II looks amazing, this fact should've been emphasised more. We all know what open marketing is and how it works, but whenever it tends to block the way of technological advancement, nothing stops the enthusiast community from slamming it. Isn't that a simple truth?
 
 
# RE: RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyAdam 2010-09-20 06:53
Wait, what? What are they censoring?

So CUDA and PhysX are the only things keeping Nvidia going, even though they currently hold the best value-for-money cards and the most powerful single GPU?

"a say again: to remain competitive you have to discover new stuff not a new scheme "
Eh, that makes little to no sense, no idea what you're getting at.


I've owned cards from both manufacturers in the past, and although I currently own a GTX460, I would not hesitate in the future to buy an ATI if they have the right card for the right price.
Stop being a fanboy.
 
 
# RE: RE: RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyStan 2010-09-20 07:20
1. Nvidia was (and still is) traditionally ahead of AMD in their drivers' ability to get the maximum out of the hardware. The same goes for other software.

Think: if NVIDIA were to throw away the (nearly half of the) code "protecting" their hardware from "not supported" configurations, it would become even better.

2. Also traditionally, NVIDIA was (and is) behind AMD in hardware implementation. This is not the topic of the article in question, but it may be proved easily using Benchmark Reviews materials.

And without big changes in hardware architecture, the next generation of chips will be their "swan song".
 
 
# A few words about censorshipStan 2010-09-20 08:02
Sorry for some typos in the post above - I was really furious, as my post was about the unfair (really, criminal) NVIDIA practices, and a proposal that the author admit that comparing highly optimized GPU software with intentionally de-optimized CPU software was his big mistake.
 
 
# Remain on-topicOlin Coles 2010-09-20 08:07
Please keep comments and discussion on-topic with this article.

This is not your personal sounding-board.
 
 
# okmark 2010-09-20 11:52
firstly, the only real value-oriented card in nvidia's stable is the 460; that wins that slot.
but as for the other towering infernos of power-hungry monsters, no thanks.

also, since when is physx an open standard.... nvidia owns it
##google.com.au/search?sourceid=navclient&ie=UTF-8&rlz=1T4ADSA_enAU395AU396&q=physx+open+standard

anyway, either way you look at it, nvidia needs to get off its ass and make a dedicated physx card
 
 
# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiencymihai 2010-09-20 12:15
a GTX card for a couple of games? it's not worth it - maybe I just didn't like the games
 
 
# RE: NVIDIA APEX PhysX: CPU vs GPU Efficiencymihai 2010-09-20 12:28
The fact is that Nvidia has always optimized their GPUs with bribes, not with new GPU architectures or new technology, and sold old cards for generations under brand-new names... sdk 283 and the list goes on, from the beginning into the future
 
 
# Two questions:Olin Coles 2010-09-20 12:39
I have two questions for you mihai:

1) Do you realize there's a 'Reply' button? You could avoid all of the new threads.
2) Do you understand what 'bribe' means? It means to give money in exchange for something.

First you claimed that NVIDIA bribed me to write an article about PhysX (which is disrespectful and very insulting), and then you claim NVIDIA optimized their GPU with 'bribe'.
 
 
# RE: Two questions:mihai 2010-09-20 13:08
Olin Coles... wow, thank you man! reply... that's cool... I was under a physx effect
I did not say Olin Coles takes bribes - sorry if you understood it that way
but I can say that you insult our intelligence with this kind of review; we (a few billion) know how a global company sells its products
and of course I don't understand exactly what bribe means, because I am not a native English speaker.... but I saw this word used often in this kind of debate
anyway - you, me, other people - we have a chance to communicate the facts as they are... and this is enough, don't you think?.... and this is the real mass media
...how can you advise millions of people after one review?
a GPU is a unit that can do a lot of things... we don't buy cars (GPUs) for a few roads, right?
your advice points in one and only one direction
 
 
# RE: NVIDIA APEX PhysX: CPU vs GPU EfficiencyDanny 2011-02-10 05:17
Seems like the high-end foreign sports car community is getting high-end Nvidia-based customizable graphical instrument cluster panels and 10" or larger GPS units.

Well, let me see: how about an i7 2600K, Asus WS Revolution, one Quadro 5000, three Tesla C2050s, two Mushkin 60 GB SSDs in RAID 0, four WD 640s in RAID 10, and 16 GB of 1600 CAS7 quad-channel memory. Sound fast enough for ya? When you get an AMD/ATI setup to match, let's duel....

I'm not saying AMD/ATI missed the bus; what I am saying is they took the short bus, not the one the rest of us took to school!

:0)

Danny
 
 
# RE: This nonsense.John Mason 2011-02-22 00:53
Thanks for this outstanding article, now I'll just throw in my old 9800GTX card and use it for physics, woohoo.

Danny, is the winner!

nVidia is the world leader in visual computing technologies. There is no doubt; whether you guys are fanboys or not, it doesn't make any sense. I ran my 4750 Crossfire cards for nearly 3 years, then boom, Fermi launched and I jumped on that train. Then it seems that ATI is sort of making the 5970+ by just slapping on 2GB of video memory to flex around with.

Even at the highest benchmarking resolutions - through OCN, Guru3D... I mean, here is an old example: #img109.imageshack.us/img109/8202/vtgpuscp.jpg

We have one GPU chip on a card almost outrunning ATI's monster with two chips slapped on the card. The rumors of nVidia's 590 almost scare me - and what is ATI coming out with? I might even change! As well as getting a free clothes-dryer when buying either of the monsters.

And as Danny said, Tesla C2050s... in supercomputing so far, I haven't seen one machine using ATI/AMD; it's just IBM/Intel/nVidia chaos, and when opening any GPU programming software, your card just chokes without CUDA - but then again, the limitation is due to low texture rendering on early cards and so forth.

But check out ##geeks3d.com/20100606/gpu-computing-nvidia-cuda-compute-capability-comparative-table/ and perhaps you'll learn a bit about your favorite subject, or whatever you're discussing; it's like fanboy echoes, or some sort of sci-fi-speak-stealth-war-N-vs-ATI-but-none-wants-to-yell-out-their-argument.
 
 
# RE: RE: This nonsense.John Mason 2011-02-22 00:55
nVidia is just growing, and ATI is sinking; it's weird, but it's true. And as of January 2011, Intel signed a partnership with nVidia. Now the world-leading microchip semiconductor maker, Intel, and nVidia..

It's ATI's fault even back from 2006, putting out cards just to sell on the market, unlike nVidia, which lost revenue due to focusing on and developing next-gen chips. AMD CEO Jen-Hsun Huang just left and joined nVidia for a pretty obvious reason...
Bla bla bla. Read the whole story about it wherever you like... and this comment was totally off topic whatsoever, but I've been reading it, and all the posts, and an article is an article; thumbs up for the contribution. What's up with people being enemies over some hardware? Just think: "Great, now we have that; soon we're on the moon."

I grew up with black-and-white TV, and your thoughts of games? Stamped as sort of crazy.

Appreciate it! More educating, less arguing; it's 2011, enjoy it. Thinking back, this was impossible-space-technology. (Star Trek fan!)
 
 
# The truthMeFire 2012-02-16 07:12
Is that Nvidia Corp bought the extinct Ageia PhysX; that's why it's an exclusive for Nvidia cards. I also read something about it using x87 instructions - it's kinda primitive and harder for new CPUs to process. This is called unfair play, as PhysX can run only on an Nvidia GPU.
 

Comments have been disabled by the administrator.
