GeForce FX Benchmarks?



Drunken Warrior
01-04-2003, 02:12 AM
I found this site with a GeForce4 Ti 4600 and GeForce FX comparison.

Benchmark? (http://www.aceshardware.com/read_news.jsp?id=60000459)

OUTLAWS Spike
01-04-2003, 03:26 AM
Holy Sh1t Batman!!!
The FX blew away the Ti 4600

OUTLAWS The Machine
01-04-2003, 04:28 AM
Thanks Whymeeee. I wish I could see some comparison between the FX and 9700.

Falcor
01-04-2003, 03:58 PM
Holy Jeez that's some leaps and bounds.

OUTLAWS Behind You?
01-04-2003, 06:02 PM
Originally posted by OUTLAWS The Machine@Jan 3 2003, 10:28 PM
Thanks Whymeeee. I wish I could see some comparison between the FX and 9700.
I agree with Machine, these benchmarks are like comparing apples to oranges. We all knew the FX would blow the current 4600 away. I want to see how it compares to its real competition, the 9700.

Aries
01-05-2003, 04:37 PM
Originally posted by OUTLAWS The Machine@Jan 4 2003, 04:28 AM
Thanks Whymeeee. I wish I could see some comparison between the FX and 9700.
I found this link on a Belgian games site.

Although I have no idea what they are talking about (I'm technically challenged)... I think this might be a small comparison between the FX and the Radeon

http://www.maximumpc.com/features/feature_2003-01-03.html


I hope this helps

OUTLAWS The Machine
01-05-2003, 05:17 PM
Thanks Aries!

The following preview is of an early GeForce FX sample that was hand-delivered to the Maximum PC Lab by an Alienware representative. Our full preview of Alienware’s new prototype machine and the GeForce FX can be found in the February issue of Maximum PC.



We first heard about the GeForce FX, then code-named NV30, in June 2002. We received a run-down of its feature set -- pixel and vertex shaders that exceed the DirectX 9 spec, 128-bit floating-point precision throughout the 3D pipeline, and support for DDR II memory -- but weren’t able to finagle access to working silicon. Until now.



Of course, the GeForce FX card that came inside our Alienware prototype system was just as “beta” as the rest of the system. With early drivers and freshly fabbed silicon, the card we tested isn’t quite what you’ll find in stores when the card ships in February or March. In fact, the board and its drivers were so unpolished, nVidia initially refused to let us benchmark it at all, and relented only when we agreed to limit our tests to pre-approved benchmarks running at stipulated resolutions and AA settings. We gave in to all these conditions because we were intent on reporting the first GeForce FX benchmark scores, however beta they may be. Driver refinement is an ongoing process -- before and after a videocard launch -- and frame rates will improve as nVidia optimizes more and more for specific engines.



It would be silly to extrapolate fine details about the card’s performance from such a small benchmark sample. It would also be unfair, considering the un-optimized condition of the drivers. But we can make some broad guesses about the strengths and weaknesses of nVidia’s new technology. In Quake III running at 1600x1200, 32-bit color and 2x anti-aliasing, the GeForce FX is about 40 percent faster than the ATI Radeon 9700 Pro at the same settings. The GeForce is almost 20 percent faster than the 9700 Pro in the Unreal Tournament 2003 Asbestos fly-by demo at these same settings. However, in the 3DMark 2001:SE Game 4 benchmark at these settings, the Radeon 9700 is about 10 percent faster than the GeForce FX.



What does this suggest? That the GeForce FX is very fast -- particularly when memory bandwidth isn’t an issue. Remember that the GeForce FX’s 128-bit memory bus runs at 500MHz, but has a maximum bandwidth of just 16GB/sec. Meanwhile, the Radeon 9700’s 256-bit memory interface accommodates 19.8GB/sec, even though it runs at just 325MHz.
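[Editor's note: the bandwidth figures above can be sanity-checked with the standard formula for peak DDR throughput (bus width in bytes × memory clock × 2 transfers per cycle). The ~310MHz memory clock used below for the Radeon is an assumption inferred from the quoted 19.8GB/sec figure, not a number from the article:]

```python
def ddr_bandwidth_gbps(bus_width_bits, clock_mhz):
    """Peak DDR bandwidth in GB/s: bytes per transfer x clock x 2 transfers/cycle."""
    return (bus_width_bits / 8) * (clock_mhz * 1e6) * 2 / 1e9

# GeForce FX: 128-bit bus at 500MHz DDR II
fx = ddr_bandwidth_gbps(128, 500)       # 16.0 GB/s, matching the article

# Radeon 9700 Pro: 256-bit bus; the quoted 19.8GB/s implies a memory clock
# near 310MHz (assumption -- the article states 325MHz)
radeon = ddr_bandwidth_gbps(256, 310)   # 19.84 GB/s

print(f"GeForce FX: {fx:.1f} GB/s, Radeon 9700 Pro: {radeon:.2f} GB/s")
```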



The GeForce FX’s core graphics processor is much faster than the Radeon 9700’s, so it will be able to draw as many polygons and fill as many pixels as will fit across the memory pipeline. Our hunch is that turning on 4x anti-aliasing at 1600x1200 would diminish the GeForce’s performance lead over the Radeon, or maybe even nix it entirely. But that’s just a guess based on the scores we achieved, and the fact that nVidia wouldn’t let us run anything that would stress the memory pipeline.



We are much more surprised by the Game 4 scores. We expected to see the GeForce FX’s 500MHz core flex its programmable-shader muscle in this DirectX 8 benchmark. nVidia says that the FX’s programmable shaders are able to run more complex shader programs than those mandated by the DirectX 9 spec. Our guess is that the nVidia drivers just aren’t tuned for this particular benchmark yet.



The practical upshot is that if next year’s games -- specifically DooM III and its programmable-shader brethren -- require more raw GPU power than sheer memory bandwidth, the GeForce FX architecture will be a perfect fit. On the other hand, if next year’s games are starved for memory bandwidth, the Radeon 9700 could very well be a better choice for frame rate–hungry gamers. This is just the first round, though. We have no doubt that ATI has plans for a souped-up Radeon that will be ready to roll as soon as the GeForce FX ships. And if you really twisted our arms, we’d bet money that it will be running on a 0.13-micron core and using 256-bit DDR II memory.



Dare to Compare: GeForce FX Early Benchmarks

GeForce FX

Quake3 Demo001, 1600x1200 2xAA: 209fps

UT 2003 Asbestos, 1600x1200 2xAA: 140fps

3DMark Game4, 1600x1200 2xAA: 41fps



Radeon 9700 Pro

Quake3 Demo001, 1600x1200 2xAA: 147fps

UT 2003 Asbestos, 1600x1200 2xAA: 119fps

3DMark Game4, 1600x1200 2xAA: 45fps

Tests were run in the Alienware prototype system.
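[Editor's note: the percentage leads quoted in the preview text can be rechecked directly from the table above; this is a sketch, not part of the original article:]

```python
# Frame rates from the table above (1600x1200, 2xAA): (GeForce FX, Radeon 9700 Pro)
scores = {
    "Quake3 Demo001": (209, 147),
    "UT 2003 Asbestos": (140, 119),
    "3DMark Game4": (41, 45),
}

for test, (fx, radeon) in scores.items():
    # FX's lead over the Radeon; a negative value means the Radeon is ahead
    lead = (fx - radeon) / radeon * 100
    print(f"{test}: GeForce FX {lead:+.0f}% vs Radeon 9700 Pro")
# Quake3 comes out at about +42%, UT 2003 at about +18%, and 3DMark Game4
# at about -9%, consistent with the "about 40 percent", "almost 20 percent",
# and "about 10 percent" figures in the text.
```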

Grimmy
01-05-2003, 06:37 PM
any set date on when that new FX card will be out?

OUTLAWS DirtGod
01-06-2003, 02:22 PM
Nope, but you should see them before February ends.

Death-Dude
01-06-2003, 10:47 PM
I saw that very early look at the FX card recently, too, and it is exciting, as the card we will see will be that or more, right? It will also be interesting to see how ATI counters - no doubt in a way to maximize the fatter memory pipe they have. I'm a little surprised that nvidia hasn't moved ahead there, as ATI already has. Maybe the 6-month-old versions that nvidia loves to put out will have that improvement on them. Either way, it is a damn fast card, and will probably be the card of choice for Doom 3. And, if you can't reach that deep in your wallet, 'settling' for a 9700 Pro ain't bad either!

Sirc
01-06-2003, 10:51 PM
You can bet that the FX drivers from those benchmarks have not been fully optimized yet either. The numbers will no doubt get better. :thumbs:

Death-Dude
01-07-2003, 02:48 AM
Originally posted by OUTLAWS Sirc@Jan 6 2003, 10:51 PM
You can bet that the FX drivers from those benchmarks have not been fully optimized yet either. The numbers will no doubt get better. :thumbs:
Yeah, I'm sure the retail jobs with better drivers will enhance those numbers yet. Can you imagine playing at 16X12, with 4X AA on, liquid-smooth? Or Doom 3 with AA at all? Me be horny!! :woot: :rofl:

MR. SLiK
01-07-2003, 02:53 AM
Originally posted by Death-Dude@Jan 6 2003, 06:48 PM
Originally posted by OUTLAWS Sirc@Jan 6 2003, 10:51 PM
You can bet that the FX drivers from those benchmarks have not been fully optimized yet either. The numbers will no doubt get better. :thumbs:
Yeah, I'm sure the retail jobs with better drivers will enhance those numbers yet. Can you imagine playing at 16X12, with 4X AA on, liquid-smooth? Or Doom 3 with AA at all? Me be horny!! :woot: :rofl:
mmmm... Doom3!!!!