Source Engine Benchmarks!
DOOManiac
Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462Members, NS1 Playtester
![DOOManiac](http://members.cox.net/doomaniac/mynewhead.jpg)
ATI and NVidia fans, PAY ATTENTION!

Now that the CS:S beta is out and ships with the Video Stress Test, a benchmarking tool for HL2, some sites have started running benchmarks on it, obviously keeping in mind that it's a beta and there may be a few bugs in both performance and visual fidelity (though there weren't many).
Now I'm sure you all remember the fiasco last year when Valve announced ATI stomped the crap out of NVidia cards and people started prophesying bankruptcy claims and other silly nonsense. A few people (*cough cough*) mentioned that Valve would get the NVidia performance up before release, but we were for the most part cast aside. Well, we were right.
So without further ado: <a href='http://www.techreport.com/etc/2004q3/source-engine/index.x?pg=1' target='_blank'>The Tech Report</a> CS:Source benchmarks.
Now, I'll cut to the chase. Here's the important part, kiddos:
QUOTE (The Tech Report):
Among the $499 "image products," the Radeon X800 XT Platinum Edition outdoes both flavors of GeForce 6800 Ultra, the "regular" 400MHz version and the 450MHz "overclocked in the box" model. The X800 XT PE's advantage is most pronounced with antialiasing and anisotropic filtering enabled. For instance, with 4X AA and 8X aniso at 1280x1024 resolution, the Radeon hits 87 frames per second, while the GeForce 6800 Ultra OC averages 80 FPS and the Ultra 74 FPS. This isn't exactly dominance, but ATI is clearly on top.
Down at $399, though, it's a different story. The GeForce 6800 GT slightly but surely outperforms the Radeon X800 Pro without aniso and AA. With 4X AA and 8X aniso, the two cards are virtually tied across all four resolutions we tested.
At $299, we approach the sorts of graphics cards that many folks might actually consider buying. Here, the aging Radeon 9800 XT faces off against the brand-new GeForce 6800, and the NVIDIA card has the edge in the majority of our tests. Only in the most brutal conditions, at 1600x1200 with AA and aniso enabled, does the Radeon prevail.
Jump down near the $199-ish range, and the field gets a little crowded, with various flavors of Radeons and GeForce FXs vying for attention. I'd pick the battle of the Radeon 9600 XT versus the GeForce FX 5700 Ultra as the most interesting comparison here. The FX card isn't doing reflective water and can't run with antialiasing in this CS: Source beta version, but otherwise, the two cards pump out frames at nearly the same rate.
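To put that top-tier gap in perspective, here's a quick back-of-the-envelope Python snippet (not from the article, just me redoing the math on the FPS figures quoted above) showing how far each $499 card trails the leader:

```python
# FPS numbers taken straight from the quote above
# (1280x1024, 4X AA / 8X aniso, $499 tier).
results = {
    "Radeon X800 XT PE": 87,
    "GeForce 6800 Ultra OC": 80,
    "GeForce 6800 Ultra": 74,
}

leader = max(results.values())
for card, fps in results.items():
    gap = (leader - fps) / leader * 100
    print(f"{card}: {fps} FPS ({gap:.1f}% behind the leader)")
```

So even the "regular" Ultra is only about 15% off the pace, and the OC version about 8%. Hardly the blowout people were predicting last year.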
Also worth noting is that, in addition to pure speed, the visual fidelity between the two vendors' cards at their respective tiers is nearly indistinguishable.
What does this mean to you and me? One simple thing:
<b><u>Do not listen to the fan boys.</u></b> (On either side)
Performance and visual quality are neck and neck between NVidia and ATI. Go with your personal preference, price, or simply whatever you can get your hands on easiest/first. Regardless of which card you get, you're gonna be good to go.
Comments
They are comparing by price range; they're basically the same price, therefore the same price range. The 6800 is brand new, but they had to take a lot of stuff off of it to make it that cheap.
This is about what I expected the benchmarks to be, not the blatantly wrong benchmarks in that other post a couple days ago. Personally I can't wait for the 6600 models to come out and battle with the even lower-end cards around $150-$200.
Those benchmarks were severely flawed and completely wrong.
I'm sure they know what they are doing. Just because you can get cards cheaper doesn't mean they can.
Now accept it and don't fanboy it.
The card I'm getting with my new PC isn't even on that list
(but I had a VERY tight budget on it, so oh well)
QUOTE (DOOManiac @ Aug 20 2004, 04:38 PM):
What does this mean to you and me? One simple thing:
<b><u>Do not listen to the fan boys.</u></b> (On either side)
You're going to get a great HL2 experience, both visually and speed-wise, regardless of which of the two you go with.
Except those FX cards. They suck. (And I'm an NVidia fan, so I can say it :P)
As for the FX cards, I feel sorry for anyone duped into buying them :P You could probably stick a waffle into your AGP slot and get better performance.
It'll be interesting to see how ATI's current mid-range generation compares against nVidia's, however (X600 vs 6600).
[EDIT]
This is just to settle the ATI vs nVidia quality debate. Click for full-res pics.
<a href='http://www.doriangray.ca/images/gfx/atidoom3.jpg' target='_blank'><img src='http://www.doriangray.ca/images/gfx/atidoom3small.jpg' border='0' alt='user posted image' /></a>
<a href='http://www.doriangray.ca/images/gfx/nvdoom3.jpg' target='_blank'><img src='http://www.doriangray.ca/images/gfx/nvdoom3small.jpg' border='0' alt='user posted image' /></a>
Can you tell which is from which manufacturer? ATI's is slightly sharper, but there's not a huge difference (mind you, this is from the top-of-the-line card from each manufacturer, on Ultra quality in Doom 3). Anyway, the top one is an X800 XT PE and the bottom is a GeForce 6800 Ultra OC.
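If you don't trust your eyes, here's a rough sketch of how you could diff the two full-res shots yourself. This assumes Python with the Pillow imaging library installed and the two screenshots downloaded locally; the filenames are just placeholders:

```python
from PIL import Image, ImageChops

# Placeholder filenames; use the two full-res screenshots linked above.
# Both shots need to be the same resolution for a pixel-wise diff.
ati = Image.open("atidoom3.jpg").convert("RGB")
nv = Image.open("nvdoom3.jpg").convert("RGB")

# Per-pixel absolute difference; brighter pixels mean bigger rendering differences.
diff = ImageChops.difference(ati, nv)
diff.save("ati_vs_nv_diff.png")

# One number summarizing how far apart the two renders are on average (0-255 scale).
pixels = list(diff.getdata())
mean_diff = sum(sum(p) for p in pixels) / (len(pixels) * 3)
print(f"Mean per-channel difference: {mean_diff:.2f} / 255")
```

Keep in mind both shots are JPEGs, so some of what shows up in the diff is compression noise rather than the cards themselves.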
QUOTE (DOOManiac @ Aug 20 2004, 04:38 PM):
You're going to get a great HL2 experience, both visually and speed-wise, regardless of which of the two you go with.
Except those FX cards. They suck. (And I'm an NVidia fan, so I can say it :P)
/me punches DooManiac in the gut
Don't listen to him, ol' 5900XT girl. :(
I'm sticking with DooManiac too, he seems to be the only person capable of creating a balanced argument.
I'm an nVidia fan anyways, mainly because of the 2 ATI GPUs I have had: one exploded and the other is just plain crap, even taking into account that it's a mobility card. I'm sure people have had similar experiences with nVidia, but from my experience, nVidia FTW.
BTW, does anyone have any info on the Gainward GeForce 6800 256DDR Golden Sample vs an ATI card? The standard software that comes with the card lets you overclock it; just wondering how they fare against ATI in that state?
QUOTE:
I'm sticking with DooManiac too, he seems to be the only person capable of creating a balanced argument.
I'm an nVidia fan anyways, mainly because of the 2 ATI GPUs I have had: one exploded and the other is just plain crap taking into account it's a mobility card. I'm sure people have had similar experiences with nVidia, but from my experience nVidia FTW.

I agree; doom and coil are the only ones whose arguments I didn't want to crap myself while reading.
That said (and to repeat myself), this is great news. Get whichever card you like. (:
Instant signature quote.
<img src='http://www.doriangray.ca/images/gfx/nvzoom.jpg' border='0' alt='user posted image' />
<img src='http://www.doriangray.ca/images/gfx/atizoom.jpg' border='0' alt='user posted image' />
The edges are more pronounced in the Radeon image than in the Geforce image. Not much of a difference, but it exists.
Basically, which card you want depends on the game. If you want top-of-the-line HL2, go for ATI. If you want uberly good Doom 3, go for nVidia. Both brands seem to do both games quite well, but since each game was designed specifically with one vendor's cards in mind, one will obviously perform slightly better.
Also, neither company has released drivers with major optimizations for the Source engine yet. Judging from Doom 3, those drivers can make a significant difference...
I would prefer you not say things like that. Dorian is being VERY neutral.
The games were mostly designed with DirectX and OpenGL in mind, respectively. It's not like VALVe went, "yeah, let's make a really shader-intensive game so nV's FX hardware sucks as much as possible." They wanted to take their games in those directions, and different hardware just happened to be better suited to different games. Only fanboys are afraid to admit that ATi and nVidia both have their own sets of problems and merits.
One, you saved them in JPG format, which is lossy. No one comparing image quality (IQ) will save in JPG; only BMP, TIFF, or possibly PNG. A good chunk of the worst artifacting gets smoothed away by the compression. (See the sketch after this list.)
Two, texture corruption on the nVidia pic is still visible. (Look at the 'plates' on the walls, along the edges)
Three, you saved them with filenames to identify which was which.
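To illustrate the JPEG point, here's a rough sketch (again Python with Pillow, filename made up) that round-trips a lossless screenshot through JPEG and measures how much the compression alone changes the image, independent of either video card:

```python
import io
from PIL import Image, ImageChops

# Placeholder: any lossless screenshot (BMP/PNG/TIFF) from either card.
original = Image.open("screenshot.png").convert("RGB")

# Re-encode it as JPEG in memory, then decode it again.
buf = io.BytesIO()
original.save(buf, format="JPEG", quality=90)
buf.seek(0)
roundtrip = Image.open(buf).convert("RGB")

# Any nonzero error here is damage done purely by the compression,
# which is noise sitting on top of the actual ATI-vs-nVidia differences.
diff = ImageChops.difference(original, roundtrip)
pixels = list(diff.getdata())
mean_err = sum(sum(p) for p in pixels) / (len(pixels) * 3)
print(f"Average per-channel error from JPEG alone: {mean_err:.2f} / 255")
```

If that number is in the same ballpark as the difference between the two cards' shots, the comparison isn't telling you much about the hardware.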