physics PPU - vaporware - RIP



JIMINATOR
03-20-2006, 08:39 PM
You may have seen the announcement about the physics PPU add-in card last year; apparently nothing ever came of it. Well, it turns out that nvidia is teaming up with Havok to offer a development kit that lets the video card's GPU be used for physics calculations. Now, this is not going to be a new video card, but new software that will work on the 6x00 series and up, and on ATI's cards too. The current crop of cards has enough muscle to spare for physics, and this may also be a reason to upgrade. Pretty damn cool. Has to be the nail in the coffin for that PPU nonsense.

http://theinquirer.net/?article=30413
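
To give a rough idea of why a GPU is even a candidate for this: most per-object physics updates are independent of each other, which is the same kind of data-parallel work the card already does for vertices. Here is a toy sketch of that kind of loop (my own illustration, not anything from the article or from Havok's actual SDK):

/* toy example: every particle's update is independent of the others,
   which is why this kind of loop maps well onto a GPU's parallel units */
#include <stdio.h>

#define N  15000        /* same ballpark as the boulder demo */
#define DT 0.016f       /* one 60 fps frame */

typedef struct { float x, y, z, vx, vy, vz; } Particle;

static void step(Particle *p, int n)
{
    for (int i = 0; i < n; i++) {      /* each iteration stands alone */
        p[i].vy -= 9.81f * DT;         /* gravity */
        p[i].x  += p[i].vx * DT;
        p[i].y  += p[i].vy * DT;
        p[i].z  += p[i].vz * DT;
        if (p[i].y < 0.0f) {           /* crude ground bounce */
            p[i].y  = 0.0f;
            p[i].vy = -0.5f * p[i].vy;
        }
    }
}

int main(void)
{
    static Particle boulders[N];       /* everything starts at rest, 10 units up */
    for (int i = 0; i < N; i++) boulders[i].y = 10.0f;
    for (int frame = 0; frame < 60; frame++) step(boulders, N);
    printf("boulder 0 is at y = %.2f after about a second\n", boulders[0].y);
    return 0;
}

On the card, each iteration of that loop would essentially run as its own thread instead of one after another on the cpu.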

Caged Anger
03-20-2006, 10:40 PM
cool beans man, good find

Sirc
03-21-2006, 01:35 AM
The current crop of cards has enough muscle to spare for physics

Erm, no, they don't. Where did that come from? I can run max settings for all of the current games, but Doom 3 can still get a bit laggy occasionally. There is no way the gpu can handle the physics of a large particle system without it thumping graphics performance. Lies!

And the next generation of graphics cards won't have the spare muscle, nor will the next, or the next. As the graphics slowly become more and more photo-realistic the gpu will be called upon to perform more and more graphics-oriented calculations.

You have taken the word of nVidia as truth! Have you learned nothing? :P

Wiper
03-21-2006, 03:01 AM
At least you can make a choice between (occasionally) laggy or lower graphics ;)

I think it will be a godsend for my pc, where the cpu is the bottleneck and so my card isn't fully used...

not fully used = spare muscle :P

Sirc
03-21-2006, 03:16 AM
If your cpu/system is a bottleneck for your graphics card, I doubt that it would work very well anyway. Unless the physics calcs stay on the card and don't require system memory or the mobo's cpu. I dunno. The whole thing smells of smoke and mirrors to me.

JIMINATOR
03-21-2006, 04:17 AM
hrm, doom3 is capped at 60 or 70 fps, I forget which. I can run quake4 with a 7800gt and have no issues. The thing about physics is that the video cards are doing the geometry anyway. Reading another article, it looks like nvidia is saying this will mainly be for SLI systems. They had a boulder benchmark with 15,000 bouncing boulders, and the SLI system managed 64 fps while the CPU alone did 6.2, so it was a huge difference.
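
Just dividing the two frame rates they reported, nothing more scientific than that:

#include <stdio.h>

int main(void)
{
    /* frame rates quoted for the 15000-boulder demo */
    float sli_fps = 64.0f;
    float cpu_fps = 6.2f;
    printf("SLI is %.1fx faster than the CPU alone\n", sli_fps / cpu_fps);  /* about 10x */
    return 0;
}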

more info here...

http://enthusiast.hardocp.com/article.html?art=MTAwNSwxLCxoZW50aHVzaWFzdA==

and here:

http://www.tgdaily.com/2006/03/20/nvidia_sli_forphysics/

So say you have an SLI system; I am sure it is underutilized by most games. What if you could now have awesome physics just by downloading new drivers? Certainly nvidia is doing this to sell more SLI systems, but at $300 x 2 for a pair of 7900gt's, it is very affordable. (Remember, a 7900gt = a 7800gtx; if you have evga cards, they have an upgrade program that you should look into....)

Sirc
03-21-2006, 06:23 AM
Geometry != physics. But I agree that SLI gpus aren't being used to their capacity. Doesn't the newest Asus mobo use dual PCIe x16 slots to run SLI at 2x16 instead of 2x8? I thought I read something like that somewhere - lol, Google "something like that somewhere" and you should find the article. :thumbs:
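
For reference, the raw slot bandwidth behind 2x8 vs 2x16, assuming first-generation PCI Express at roughly 250 MB/s per lane in each direction:

#include <stdio.h>

int main(void)
{
    /* first-gen PCI Express: 2.5 GT/s per lane with 8b/10b encoding,
       which works out to about 250 MB/s usable per lane, per direction */
    float per_lane = 250.0f;
    printf("x8  slot: about %.0f MB/s each way\n",  8 * per_lane);
    printf("x16 slot: about %.0f MB/s each way\n", 16 * per_lane);
    return 0;
}

Whether physics traffic would ever need that much slot bandwidth is another question; if the simulation stays resident on the card, it mostly wouldn't.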

Still, by the time games actually support this, my hardware will be obsolete anyway. It sounds great in theory, but we'll see how it stands up to future cpus/gpus. Hardware implementations are always better than software, and I suspect that by the time (or before) anything like this actually becomes a reality, there will be a better hardware solution integrated into either gpus or cpus.

Wiper
03-21-2006, 01:04 PM
If your cpu/system is a bottleneck for your graphics card, I doubt that it would work very well anyway. Unless the physics calcs stay on the card and don't require system memory or the mobo's cpu. I dunno. The whole thing smells of smoke and mirrors to me.


Yes, I'm not sure whether it will actually work either, but it can't hurt to try...

The reason I still have hope is that I have a 6600GT card (core 500, memory 1000), while the normal 6600 runs at 300/550 and still handles the latest games on fairly good settings.

With my processor clocked at 1.9 GHz, an increase in both (core and memory: almost twice as much, especially when you overclock) might be "very noticeable" during gaming :)
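
The "almost twice as much" is easy to check against the stock clocks above:

#include <stdio.h>

int main(void)
{
    /* stock 6600 vs 6600GT clocks quoted above, in MHz */
    float core_6600 = 300.0f, core_gt = 500.0f;
    float mem_6600  = 550.0f, mem_gt  = 1000.0f;
    printf("core: %.2fx\n", core_gt / core_6600);  /* about 1.67x */
    printf("mem:  %.2fx\n", mem_gt  / mem_6600);   /* about 1.82x */
    return 0;
}

So an overclock on top of that would push both ratios toward 2x.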

ME BIGGD01
03-21-2006, 05:18 PM
SLI seemed to be a promising technology, but overall it has too many bugs and most games are not written for it. The problem in my opinion is not SLI but the programmers. I take this approach with all of the buggy problems in software. I am nowhere near a programmer, nor can I relate to their workload, but from a business stance I see that they are given a limited time to finish a product, which causes shortcuts to be made. As far as lag in a game goes, it is not always the video card that causes it. Let's not forget the limits of the pc platform. Everything requires its numbers to be crunched a different way, which uses not only the vpu but also the cpu, memory, and hard drive. At this time I think Crossfire from ATI has a better plan as far as using 2 cards together. Still, I have avoided both because I just don't think either is worth buying an extra video card for, given the problems and performance gains. I usually play my games at no more than 1024 resolution, which defeats the purpose of either.

Most people who are running SLI are usually not running or meeting the true requirements. This makes things even more expensive. Although most will see things running smoothly, the tax will eventually catch up and damage one of the parts in their pc, or just the power supply. The 6x series of cards in SLI are more power hungry than the 7x series. Still, I would not recommend anyone get less than a 600 watt top-grade supply.

If you know me, I love innovation regardless of whether it works well or not. I like to see changes because they always get cleaned up in future revisions. As far as bandwidth of the PCI Express lanes goes, there is more than enough, and it should not have anything to do with lag or be a cause of lag. I blame the drivers and coding for this.

I will also hold off on buying any SLI solution, considering the lifespan and performance versus next generation cards. I think the competition is moving things too fast for anyone to get a worthy investment. It was not long ago that two 6800 cards were the top of the line and would have cost 1000 bucks or more to run in SLI. Considering the lack of support for SLI, that seems to have been a bad investment, given that the newer cards run faster in single-card mode than two of those. That does not include the electric bill either.

JIMINATOR
03-22-2006, 11:18 PM
hmmm, i may have been wrong about the physics thing. Take a look at http://physx.ageia.com/footage.html - they have some videos, including a with-and-without comparison. The effects look pretty sweet. I don't really care for the ageia solution (dell includes the card in their quad-sli system for $9000), but the thought of having a couple of video cards and a slider to adjust the physics processing allocation, well, that is extremely appealing.