GamersDepot Pans nVidia, Cheers ATI

coil Amateur pirate. Professional monkey. All pance. Join Date: 2002-04-12 Member: 424, Members, NS1 Playtester, Contributor
<div class="IPBDescription">no, they're not sellouts.</div> Gamersdepot did a review of the <b>ATI Radeon 9800 Pro 128MB</b> and the <b>nVidia GeForce 5900 Ultra 256MB</b>, examining how they ran the 1.5 Press Release of Halo PC, as well as Tomb Raider AoD (yes, it's a crap game, but it's a visually intensive crap game). The results were... telling.

Here are the benchmarks for Halo PC:
[Image: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/halo_beta.gif]

Here are the benchmarks examining pixel shader performance (a key element of DirectX 9, and one of the major reasons next-generation games like HL2 and Doom 3 look as good as they do) in Tomb Raider AoD at 1024x768:
[Image: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/tomb_raider.gif]

Read the full article here: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

Doesn't leave a lot to the imagination - 2.0 shader performance on the nVidia card is abysmal compared to the ATI card. But wait... there's more!

Gabe Newell announced HL2 benchmarks yesterday; here are some telling remarks from the opening of the GamersDepot article on it:
QUOTE (GamersDepot):
"In his speech, Gabe outlined several key points of both personal and professional frustration:
- Valve takes serious issue with "optimizations" from NVIDIA as of late
- In no uncertain terms, Valve is highly disappointed with current NV3x hardware as a high-performance DX9 accelerator
- Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings.

According to Gabe, the rumors and speculation of ATI paying them off are nothing but bull - he said Valve's top priority has everything to do with wanting HL2 players to have the best experience. After doing some early benchmarking between NVIDIA and ATI, the choice of who to partner with was clear from Valve's standpoint.

Microsoft's DirectX team was on hand to give its full blessing to Valve's upcoming HL2 benchmark - in fact, it's being referred to as the most complete DX9 benchmark to date."
<i>"Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings."</i> -- While HL2 can still run DX9 detail with an nVidia card, this means that if you have a GeForce FX, your <i>default </i>visual options will *not* include the extra DX9 goodies. You can enable them yourself, but expect a performance hit.

Here are benchmarks from the press release.
This graph shows each card running DX9, with the nVidia cards running both "mixed mode" (less detail) and the so-called "Full Precision" mode, which is the way Valve "intended" HL2 to be seen. The Radeon is running Full Precision:
[Image: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_fullvpre.gif]

In an FPS-per-dollar comparison, both the 9600 (it's actually a Pro; that's a typo in the image) and the 9800 were better buys than any of the nVidia cards (a rough sketch of the math follows the graph):
[Image: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_full.gif]
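The math behind that kind of chart is simple enough to redo yourself: frames per second divided by street price. Here's a minimal Python sketch of the calculation; the card names match the review, but the fps and price figures are placeholders I made up for illustration, not the article's actual data:

```python
# FPS-per-dollar sketch. The fps and price values below are made-up
# placeholders, NOT the article's numbers -- plug in the real benchmark
# results and current street prices to reproduce a chart like the one above.
cards = {
    "Radeon 9600 Pro":       {"fps": 45.0, "price": 170.0},
    "Radeon 9800 Pro":       {"fps": 60.0, "price": 380.0},
    "GeForce FX 5900 Ultra": {"fps": 30.0, "price": 500.0},
}

# Sort by frames per dollar, best value first.
ranked = sorted(cards.items(),
                key=lambda item: item[1]["fps"] / item[1]["price"],
                reverse=True)

for name, d in ranked:
    value = d["fps"] / d["price"]
    print(f"{name:<22} {d['fps']:5.1f} fps / ${d['price']:.0f} = {value:.3f} fps per dollar")
```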

Why will HL2 default to DX8.1 for nVidia cards? Because of these results. Note the most amazing point -- a GeForce 4 Ti 4600 running DX8.1 actually gets BETTER performance than two of the three FX cards when THEY are running DX8.1!
[Image: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/hl2_valve_dx9v8.gif]

Full article is here: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/001.htm

With all this, on top of the fact that the 9800 Pro costs less than 75% of what the FX 5900 does... all I can say is that I know what card I'm going to be upgrading to.

Comments

  • Beast Armonkyi Join Date: 2003-04-21 Member: 15731, Members, Constellation
    Just something which likely complicates this more: there are good and bad nVidia cards, and they haven't said which of the nVidia cards they tested... so this is a bit meaningless, in my opinion. Same goes for the Radeons, I think.
  • MrMojo Join Date: 2002-11-25 Member: 9882, Members, Constellation
    I don't trust sites doing back-to-back card reviews.
  • Siberian_Dingo Join Date: 2003-01-15 Member: 12326, Members
    I still love my GeForce 3 Ti 500, and I will continue to buy nVidia cards. No benchmark is going to change my mind. I usually get the VisionTek or PNY boards for them.
  • Mercior Join Date: 2002-11-02 Member: 4019, Members, Reinforced - Shadow
    You have to remember that the ATI drivers are far, FAR, FAR worse than the GeForce ones, and have been optimised for benchmarking tests to try to trick people into buying their card. When you actually come to play a game with a Radeon, you'll find it doesn't perform even close to the GeForce series. I had a voltmodded Radeon 9700 Pro not long ago and it got about 40 fps in Half-Life, and 15 fps in Battlefield if I was lucky. My current GeForce 4 is >100 fps on both those games.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    Indeed, after seeing this I really want to get a 9800 Pro. Then I remember I'm about $900 in debt, so I guess I should pay that off first... :P

    BTW November 15th or so = video card price drop day. Remember kiddos!
  • coil Amateur pirate. Professional monkey. All pance. Join Date: 2002-04-12 Member: 424, Members, NS1 Playtester, Contributor
    Mercior, did you miss the fact that the first article explicitly put the cards to work on two *games*, and not on benchmark tests?

    Regarding ATI drivers, they had this to say about their new architecture. Yes, it was said by ATI, so take that for what it's worth:
    QUOTE (ATI):
    "The R300 architecture was built from the ground up for performance in DirectX 9. DX9 instructions *map naturally to our hardware, without any tweaking or driver optimizations.* This is important, because the vast majority of games out there won't have the benefit of driver optimizations from anyone (unlike certain game benchmarks), because no-one has the engineering resources to spare. Whether you look at brute force (our 8 pipes versus the competitor's 4) or elegance (we can run many shader operations in parallel that our competitor can't), we have a fundamental advantage with our hardware. With more and more DirectX 9 games coming onto the market, the battle will be all about who can run shaders faster and more efficiently. And our shader performance is hugely better. ShaderMark and other tests show shader performance that is three to six times better on ATI's hardware than Nvidia's. This architectural advantage is evident in shipping games like Tomb Raider: Angel of Darkness. You will also see this in Half-Life 2 and every DX9 game coming out before the holiday season."
    Yeah, benchmarks aren't the best indicators... but results in actual games are rather hard to refute, Mercior.
  • SillyGoose Join Date: 2003-03-16 Member: 14572, Members, Constellation
    QUOTE (Siberian Dingo @ Sep 11 2003, 03:35 PM):
    "I still love my GeForce 3 Ti 500, and I will continue to buy nVidia cards. No benchmark is going to change my mind. I usually get the VisionTek or PNY boards for them."
    VisionTek only makes ATi cards now.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    Which makes me sad. Unless I do cave in and get the 9800 Pro, then I can go "yay VisionTek!"

    VisionTek = best video card maker *EVER*
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    QUOTE (coil @ Sep 11 2003, 02:12 PM):
    "In an FPS-per-dollar comparison, both the 9600 (it's actually a Pro; that's a typo in the image) and the 9800 were better buys than any of the nVidia cards:
    [Image: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_full.gif]"
    I found this part to be the most impressive myself, as the 9600 Pro is definitely an affordable card.

    What I'd love to see is a huge benchmark including every major video card as far back as at least the GeForce 3 generation, maybe even back to the GF1, all on low, medium, and high quality settings.
  • Wheeee Join Date: 2003-02-18 Member: 13713, Members, Reinforced - Shadow
    edited September 2003
    You mean the drivers that nVidia "optimizes" by forcing shaders to run at lower floating-point precision, Mercior? (There's a quick sketch of what that precision drop means right after this post.)

    *edit* btw, voltmodding is dangerous, and can ruin your board. 40fps is ridiculously low, I get 99-100 fps most of the time with my 9500 non-pro. Dunno what your card's problem was.
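    To put a rough number on what "lower floating-point precision" costs, here's a small Python/numpy sketch comparing 16-bit and 32-bit floats. It's purely illustrative of the FP16 ("partial precision") versus FP32 trade-off being argued about here, and has nothing to do with nVidia's actual driver code:

```python
import numpy as np

# Compare the same arithmetic at 32-bit and 16-bit float precision.
# NV3x "partial precision" is FP16; full DX9 precision is at least
# FP24 (ATI R300) or FP32 (nVidia full precision).
eps32 = np.finfo(np.float32).eps   # smallest step near 1.0 at 32 bits (~1.19e-07)
eps16 = np.finfo(np.float16).eps   # smallest step near 1.0 at 16 bits (~9.77e-04)
print(f"FP32 resolution near 1.0: {eps32:.2e}")
print(f"FP16 resolution near 1.0: {eps16:.2e}")

# A long dependent chain of operations drifts noticeably faster at FP16.
v32 = np.float32(0.1)
v16 = np.float16(0.1)
for _ in range(100):
    v32 = v32 * np.float32(1.01)
    v16 = v16 * np.float16(1.01)
print(f"after 100 multiplies: FP32={float(v32):.6f}  FP16={float(v16):.6f}")
```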
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    I run a 9600 Pro and I can't remember the last time I saw less than 100 fps (vsync on, 1280x1024, no AA, no AF). Makes me wonder: did you volt-mod the card before or after getting 40 fps in HL? ;)

    Did you bother to properly install the drivers?
  • Mercior Join Date: 2002-11-02 Member: 4019, Members, Reinforced - Shadow
    edited September 2003
    I got the voltmodded card off a friend who does a lot of overclocking (he was the first guy to achieve 4GHz without liquid nitrogen \o/); he assured me it was the best Radeon he had. I suspect that the Radeon cards will outperform GeForce cards (very slightly) in Half-Life 2, but Radeons can barely run Half-Life (or any older games) because of bugs in their drivers which ATI don't seem to care about fixing. Personally, I am sticking with the GF series for their solid drivers, regular updates & best all-round performance.

    edit: The card ran GTA Vice City & Unreal 2 just fine, so I'm pretty sure it's driver issues causing the Half-Life problems. Another friend of mine recently returned a Radeon card because it couldn't run Half-Life reasonably.
  • CForrester P0rk(h0p Join Date: 2002-10-05 Member: 1439, Members, Constellation
    *sniffs the air lightly* I smell fair-weather fans on their way over to ATi's side until the next nVidia card.

    ATi forever. ;)
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    edited September 2003
    The plot thickens.

    NVidia has responded with:
    QUOTE (nVidia):
    "The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

    In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

    We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware."
    (Which is what some have already pointed out.)
  • Maveric Join Date: 2002-08-07 Member: 1101, Members
    Now, this sort of stuff really p***es me off to no end.

    "BUY [THIS] CARD AND USE DX9 ON [THAT GAME] AND GET MORE VISUALS!!!"
    "BUY [THE OTHER] CARD AND USE DX 8.1 ON [THAT GAME] AND GET MORE FPS!!!"


    ____
    F***! Now what am I supposed to do when I buy HL 2?! Will I be gimped because HL 2 doesn't "like" nVidia cards but "loves" ATI cards?! Can I be the first to say: "WT*H AM I SUPPOSED TO DO NOW?!?"?

    :(
    /me is sad as me = confuzed about this ATI > nVidia for HL 2.
  • Mercior Join Date: 2002-11-02 Member: 4019, Members, Reinforced - Shadow
    Interesting, DOOManiac. I thought the figures on those Valve "benchmarks" were very dodgy, but this explains a lot. From reading that article, I get the impression that ATI have worked closely with Valve to produce a driver set that can run Half-Life 2 quite smoothly, and nVidia haven't been given a chance to look at the game.

    They can now release these benchmarks of Half-Life 2 running with a driver set highly optimised for ATi cards and with old nVidia drivers that have zero optimisations for the game, and get stupidly large performance differences between the cards that will trick your average joe gamer into thinking the Radeons really are faster than the GeForce cards. The reality is that both cards are pretty much equal in processing power; each has its own slight advantages, but overall they are pretty even. nVidia will release a driver set after the release of Half-Life 2 which will boost the GeForce series' performance up in line with the Radeons (or, knowing nVidia, probably outperforming the Radeon series); they just don't have the chance to do this beforehand because ATI paid Valve a lot of money to support their card exclusively.

    It's posts like this, coil, that are helping ATI in their mission to spread lies about nVidia.
  • Argo Join Date: 2002-11-01 Member: 2961, Members
    Well, if you read the articles carefully, Valve have worked with nVidia VERY hard to try to optimize the HL2 code to run at an acceptable rate on FX-class cards. To the point where they actually created a completely separate rendering path JUST for the NV cards, with mixed 8.1/9 shaders, to allow the FX cards to run HL2 at a decent rate. All of this versus the ATI cards, which run on a pure, un-optimized DX9 path. (A rough sketch of what that kind of path selection might look like follows this post.)
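    Purely to illustrate the idea of per-hardware codepaths described above (this is a hypothetical sketch, not Valve's actual engine code; all the names and thresholds are made up for the example), the selection might look something like this:

```python
# Hypothetical per-hardware render path selection: NV3x gets a special
# mixed DX8.1/DX9 path, other DX9-class parts get the generic DX9 path,
# and older cards fall back to DX8-class or fixed-function rendering.
from dataclasses import dataclass

@dataclass
class GpuCaps:
    vendor: str
    family: str
    max_pixel_shader: float  # highest pixel shader model supported

def choose_render_path(caps: GpuCaps) -> str:
    if caps.max_pixel_shader >= 2.0:
        if caps.vendor == "nvidia" and caps.family == "nv3x":
            # FX cards: mostly DX8.1-style shaders, DX9 only where it
            # matters visually (often at reduced precision).
            return "mixed_dx81_dx9"
        return "generic_dx9"        # e.g. Radeon 9500-9800: plain DX9 path
    if caps.max_pixel_shader >= 1.1:
        return "dx8_class"          # e.g. GeForce 3/4 Ti, Radeon 8500
    return "dx7_fixed_function"     # everything older

print(choose_render_path(GpuCaps("ati", "r300", 2.0)))     # generic_dx9
print(choose_render_path(GpuCaps("nvidia", "nv3x", 2.0)))  # mixed_dx81_dx9
print(choose_render_path(GpuCaps("nvidia", "nv25", 1.3)))  # dx8_class
```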
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    Here's what I think:

    1) nVidia can't afford for Half-Life 2 to run like crap; they'll get it running great.
    2) After ATI screwed up with id, they were probably very desperate to find another software partner.
    3) Valve would be stupid to have over 50% of their customer base (last time I checked NVidia still had the majority of the market) run their game like crap...
  • Wheeee Join Date: 2003-02-18 Member: 13713, Members, Reinforced - Shadow
    edited September 2003
    Iunno. I tend to think that a lot of people will want to upgrade for HL2.

    *edit* The number of people with GFFX 5800+ class cards is relatively small. You'll probably get people upgrading to play HL2, so I wouldn't be so sure that Valve would be alienating anyone. And it's not Valve's fault that the FX series can't handle pixel shader 2.0 routines worth crap.
  • Javert Join Date: 2003-04-30 Member: 15954, Members
    DOOManiac, you keep referring to Nov. 15 as a price-drop date. Not that I doubt you (price drops always come in the fall), but I wonder what your source is?
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    At QuakeCon, the guy from HardOCP, along with representatives from both nVidia and ATI, kept naming that date as when massive price drops should happen, though they couldn't give away too much.

    I can't guarantee anything in stone, but... probably a new product launch of some kind; that always pushes current hardware prices down.
  • airyK Join Date: 2002-12-19 Member: 11126, Members
    Well, check out this article linked from Slashdot (http://www.3dcenter.org/artikel/cinefx/index_e.php); a lot of technical stuff, but it explains a lot. And this is the rebuttal from nVidia about the benchmarks: http://www.hardocp.com/article.html?art=NTIw. It seems the 50.xx series Detonators will be optimized for better shader performance, among other things. Don't start screaming about which company sucks until it's all done with, but it sucks to be an nVidia owner right now.
  • Wheeee Join Date: 2003-02-18 Member: 13713, Members, Reinforced - Shadow
    I'd be very suspicious of nVidia's drivers if you started seeing 30% improvements in pixel shader 2.0 performance, which is pretty much what they'll need to beat the Radeon...
  • Nil_IQ Join Date: 2003-04-15 Member: 15520, Members
    Well, I'm about to buy a graphics card TODAY, and now you have confused me. Thank you very much.

    You say you got 40 fps with a 9700? I get better than that with a 7000! What was your processor? Or maybe your friend just overclocked it so much he borkefied it.

    I'm getting a Radeon 9500 pro, don't try to stop me! Raaaargh!!!
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    edited September 2003
    I can't help but be amused as the nVidia fanboys struggle to find an explanation (even delving into conspiracy theories) for why their cards are falling behind on new-generation (DX9) games.

    As has been stated before, nVidia cards are all about power: cranking the theoretical fillrate to boost the actual rate slightly. The problem is *stagnation*. They're starting to fall into the same trap that 3Dfx did, reusing their pipelines and bolting on little bits now and again. (A quick back-of-the-envelope fillrate comparison follows this post.)
    As a result, nVidia cards handle rendering like the Pentagon handles its budget... getting everything, doing the project, and throwing the 'other stuff' away at the end, when it was not really needed in the first place.

    ATI has been playing the catch-up game for a while now. They've been forced to provide a more efficient and elegant solution... and are only now starting to ramp up the power, starting with the 9500/9700.
    The difference is, nVidia is still working off an old framework. Their pixel shader 2.0 routines were added on, rather than being integral to the current architecture, which is why their support is abysmal in comparison. ATI is essentially working off a 'superstructure' built around clean performance, which will allow future games (which will rely heavily on PS 2.0+) to run more effectively, even at a lower core clockspeed. As noted in those articles, ATI cards do not need the drivers to translate a good chunk of the instructions; they just pass them straight through.

    And I don't know when the last time you looked at the Catalyst drivers was, but they've come a LONG way since the crappy Rage Fury MAXX (dual Rage128) days. Admittedly, those were a complete pain. The 3.x Cats are easy to install and throw me no errors, beyond bad coding on individual applications. (kRO forgetting to refresh the level-geometry after an alt-tab, for example, assuming nothing else is touching the card buffers)
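    To put rough numbers on the "power versus pipes" point above: theoretical pixel fillrate is just pipelines times core clock. The pipe counts and clocks below are the commonly quoted specs for these boards, not figures from the GamersDepot article, so treat them as assumptions and double-check against the manufacturers' datasheets:

```python
# Back-of-the-envelope theoretical pixel fillrate: pipelines x core clock.
# Specs are the commonly quoted figures for these boards (assumed here,
# not taken from the article); NV35 is often described as a 4x2 design.
cards = {
    "Radeon 9800 Pro":       {"pipes": 8, "clock_mhz": 380},
    "GeForce FX 5900 Ultra": {"pipes": 4, "clock_mhz": 450},
}

for name, spec in cards.items():
    fillrate_mpix = spec["pipes"] * spec["clock_mhz"]   # megapixels/second
    print(f"{name:<22} {spec['pipes']} pipes x {spec['clock_mhz']} MHz "
          f"= {fillrate_mpix} Mpixels/s")

# Raw clock speed alone doesn't close an 8-vs-4 pipeline gap, which is the
# "brute force" advantage the ATI quote earlier in the thread refers to.
```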
  • Necrosis The Loquacious Sage Join Date: 2003-08-03 Member: 18828, Members, Constellation
    As an aside, I guess either way I'll have to put my voodoo 4 out to pasture then.

    :(
  • Majin Join Date: 2003-05-29 Member: 16829, Members, Constellation
    I have a GF4 Ti 4600 128MB Xstasy.
    It's a good card, and it's most likely hurting due to my computer:
    P3 866MHz
    512MB PC133 RAM
    Windows 2000 Pro
    But I am not happy with it.
    In RB6:3RS I am lucky if I can hold more than 20 FPS; it will jump to 60 and then to 10 or lower.
    In the 3DMark test I was getting low 20s down to 0 fps.
    In the HW2 demo my computer starts to die on me, but the card isn't far behind; I get mid-30s to low 10s.
    In NS I get anywhere from 80 to 20 FPS.
    I think when I get around to dropping 4 grand on my next computer (waiting for Intel to come out with a processor with 64-bit architecture) I am going to go with an ATI card.
    I have never been a big ATI fan, but they are starting to impress me a lot.

    *Plus, they are from my hometown!
    Brampton, Ontario, Canada!
  • Wheeee Join Date: 2003-02-18 Member: 13713, Members, Reinforced - Shadow
    Link: http://www.techreport.com/etc/2003q3/valve/index.x?pg=1

    Interesting note: Gabe Newell said that Valve had to spend five times as much time optimizing the NV3x codepath as the standard DX9 codepath.

    Btw, it's pretty funny that one of Gabe's points was "Our customers will be ****."
  • alius42 Join Date: 2002-07-23 Member: 987, Members
    This problem stems from the fact that nVidia didn't build the FX series cards to DX9 spec, and that can be blamed on no one but them. Newell also mentioned that the Rel. 50 drivers nVidia has in the works do things like eliminating fog, and that they detect attempts to take screenshots and then produce a higher-quality image than is actually seen. Basically, they are using the same sorts of methods that got them in hot water with Futuremark. This does not bode well for nVidia. :(
  • xioutlawix Join Date: 2002-11-05 Member: 7118, Members, Constellation
    I have to say, I think a good majority of people in this culture seem to identify themselves with the products they buy, i.e. brand loyalty.
    What would possibly cause a person to just blatantly badmouth a company simply because they've had more experience with another?

    Having loyalty to nVidia was good... a few years ago. Things change. Read the reports, realize the reality of the current situation, and get over it. The situation could just as easily be reversed a few months from now, and any brand loyalty some might have to ATI would be equally retarded.