Gamersdepot Pans Nvidia, Cheers Ati
coil
Amateur pirate. Professional monkey. All pance. Join Date: 2002-04-12 Member: 424Members, NS1 Playtester, Contributor
in Off-Topic
<div class="IPBDescription">no, they're not sellouts.</div> Gamersdepot did a review of the <b>ATI Radeon 9800 Pro 128MB</b> and the <b>nVidia GeForce 5900 Ultra 256MB</b>, examining how they ran the 1.5 Press Release of Halo PC, as well as Tomb Raider AoD (yes, it's a crap game, but it's a visually intensive crap game). The results were... telling.
Here are the benchmarks for Halo PC:
<img src='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/halo_beta.gif' border='0' alt='user posted image'>
Here are the benchmarks examining shader performance (a key element of DirectX 9, and one of the major reasons shader-heavy games like HL2 and Doom 3 look as good as they do) in Tomb Raider: AoD at 1024x768:
<img src='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/tomb_raider.gif' border='0' alt='user posted image'>
Read the full article <a href='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm' target='_blank'>here</a>.
Doesn't leave a lot to the imagination - Pixel Shader 2.0 performance on the nVidia card is abysmal compared to the ATI card. But wait... there's more!
Gabe Newell announced HL2 benchmarks yesterday; here are some telling remarks from the opening of the GamersDepot article on it:
<b>QUOTE:</b>
In his speech, Gabe outlined several key points of both personal and professional frustration:
- Valve take serious issue with "optimizations" from NVIDIA as of late
- In no certain words, Valve is highly disappointed with current NV3x hardware as a high-performance DX9 accelerator
- Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings.
According to Gabe, the rumors and speculation of ATI paying them off is nothing but bull - he said Valve's top priority has everything to do with them wanting HL2 players to have the best experience. After doing some early-on benchmarking between NVIDIA and ATI, the choice was clear from Valve's standpoint of who to partner with.
Microsoft's DirectX team was on-hand to give full blessing to Valve's upcoming HL2 benchmark - in fact it's being referred to as the <b>most complete DX9 benchmark to date</b>.
<i>"Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings."</i> -- While HL2 can still run DX9 detail with an nVidia card, this means that if you have a GeForce FX, your <i>default </i>visual options will *not* include the extra DX9 goodies. You can enable them yourself, but expect a performance hit.
Here are benchmarks from the press release.
This graph shows each card running DX9, with the nVidia cards running both "mixed mode" (less detail) and the so-called "Full Precision" mode, which is the way Valve "intended" HL2 to be seen. The Radeon is running Full Precision:
<img src='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_fullvpre.gif' border='0' alt='user posted image'>
In an FPS-per-dollar comparison, both the 9600 (it's actually a Pro; that's a typo in the image) and the 9800 were better buys than any of the nVidia cards:
<img src='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_full.gif' border='0' alt='user posted image'>
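For anyone wanting to reproduce the math, FPS-per-dollar is just the benchmark frame rate divided by the card's street price. A trivial Python sketch with placeholder numbers (not the article's actual figures - those are in the graph above):

# FPS-per-dollar = benchmark frame rate / street price.
# The frame rates and prices below are placeholders for illustration only.
cards = {
    "Radeon 9600 Pro":       {"fps": 50.0, "price": 170.0},
    "Radeon 9800 Pro":       {"fps": 60.0, "price": 380.0},
    "GeForce FX 5900 Ultra": {"fps": 45.0, "price": 500.0},
}

for name, c in cards.items():
    value = c["fps"] / c["price"]
    print(f"{name}: {value:.3f} FPS per dollar")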
Why will HL2 default to DX8.1 for nVidia cards? Because of these results. Note the most amazing point -- a GeForce 4 Ti 4600 running DX8.1 actually gets BETTER performance than two of the three FX cards when THEY are running DX8.1!
<img src='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/hl2_valve_dx9v8.gif' border='0' alt='user posted image'>
Full article is <a href='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/001.htm' target='_blank'>here</a>.
With all this, on top of the fact that the 9800 Pro costs less than 75% of what the FX 5900 does... all I can say is that I know what card I'm going to be upgrading to.
Comments
BTW November 15th or so = video card price drop day. Remember kiddos!
Regarding ATI drivers, they had this to say about their new architecture. Yes, it was said by ATI, so take that for what it's worth:
<b>QUOTE:</b>
The R300 architecture was built from the ground up for performance in DirectX 9. DX9 instructions <i>map naturally to our hardware, without any tweaking or driver optimizations.</i> This is important, because the vast majority of games out there won't have the benefit of driver optimizations from anyone (unlike certain game benchmarks), because no-one has the engineering resources to spare. Whether you look at brute force (our 8 pipes versus the competitor's 4) or elegance (we can run many shader operations in parallel that our competitor can't), we have a fundamental advantage with our hardware. With more and more DirectX 9 games coming onto the market, the battle will be all about who can run shaders faster and more efficiently. And our shader performance is hugely better. ShaderMark and other tests show shader performance that is three to six times better on ATI's hardware than Nvidia's. This architectural advantage is evident in shipping games like Tomb Raider: Angel of Darkness. You will also see this in Half-Life 2 and every DX9 game coming out before the holiday season.
Yeah, benchmarks aren't the best indicators... but results in actual games are rather hard to refute, Mercior.
Visiontek only makes ATi cards now
Visiontek = best video card maker *EVER*
<b>QUOTE:</b>
<img src='http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_full.gif' border='0' alt='user posted image'>
I found this part to be the most impressive myself, as the 9600 Pro is definitely an affordable card.
What I'd love to see is a huge benchmark including every major video card type as far back as at least the GeForce 3 generation, maybe even back to the GF1, all on low, medium, and high quality settings.
*edit* btw, voltmodding is dangerous, and can ruin your board. 40fps is ridiculously low, I get 99-100 fps most of the time with my 9500 non-pro. Dunno what your card's problem was.
Did you bother to properly install the drivers?
edit: The card ran GTA Vice City & Unreal 2 just fine, so I'm pretty sure that it's driver issues causing the Half-Life problems. Another friend of mine recently returned a Radeon card because it couldn't run Half-Life reasonably.
ATi forever. ;)
NVidia has responded with:
<b>QUOTE:</b>
The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.
We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
(which is what some have already pointed out)
"BUY [THIS] CARD AND USE DX9 ON [THAT GAME] AND GET MORE VISUALS!!!"
"BUY [THE OTHER] CARD AND USE DX 8.1 ON [THAT GAME] AND GET MORE FPS!!!"
____
F***! Now what am I supposed to do when I buy HL2?! Will I be gimped because HL2 doesn't "like" nVidia cards but "loves" ATI cards?! Can I be the first to say "WT*H AM I SUPPOSED TO DO NOW?!?"?
:(
/me is sad as me = confuzed about this ATI > nVidia for HL2.
They can now release these benchmarks of Half-Life 2 running with a highly optimised driver set for ATi cards and with old nVidia drivers that have zero optimisations for the game, and get stupidly large performance differences between the cards that will trick your average Joe gamer into thinking the Radeons really are faster than the GeForce cards. The reality is that both cards are pretty much equal in processing power; each has its own slight advantages, but overall they are pretty even. nVidia will release a driver set after the release of Half-Life 2 which will boost the GeForce series' performance up in line with the Radeons (or, knowing nVidia, probably outperforming the Radeon series); they just don't have the chance to do this beforehand because ATI paid Valve a lot of money to support their card exclusively.
It's posts like this, coil, that are helping ATI in their mission to spread lies about nVidia.
1) NVidia can't afford for Half-Life 2 to run like crap, they'll get it running great.
2) After ATI screwed up with id, they were probably very desperate to find another software partner
3) Valve would be stupid to have over 50% of their customer base (last time I checked NVidia still had the majority of the market) run their game like crap...
*edit* The number of people with GFFX 5800+ class cards is relatively small. You'll probably get people upgrading to play HL2, so I wouldn't be so sure that Valve would be alienating anyone. And it's not Valve's fault that the FX series can't handle pixel shader 2.0 routines worth crap.
I can't guarantee anything in stone, but... probably a new product launch of some kind; that always pushes current hardware prices down.
You say you got 40fps with a 9700? I get better than that with a 7000! What was your processor? Or maybe your friend just overclocked it so much he borkefied it.
I'm getting a Radeon 9500 pro, don't try to stop me! Raaaargh!!!
As has been stated before, nVidia cards are all about power: cranking the theoretical fillrate to boost the actual rate slightly. The problem is *stagnation*. They're starting to fall into the same trap that 3Dfx did, reusing their pipelines and adding on little bits now and again.
As a result, nVidia cards handle rendering like the Pentagon handles its budget: getting everything, doing the project, and throwing the 'other stuff' away at the end, when it was never really needed in the first place.
ATI has been playing the catch-up game for a while now. They've been forced to provide a more efficient and elegant solution... and are only now starting to ramp up the power, starting with the 9500/9700.
The difference is, nVidia is still working off an old framework. Their Pixel Shader 2.0 routines were added on, rather than being integral to the current architecture, which is why their support is abysmal in comparison. ATI is essentially working off a 'superstructure' built around clean performance, which will allow future games (which will rely heavily on PS 2.0+) to run more effectively, even at a lower core clockspeed. As noted in those articles, ATI cards do not need the drivers to translate a good chunk of the instructions; they just pass them straight through.
And I don't know when the last time you looked at the Catalyst drivers was, but they've come a LONG way since the crappy Rage Fury MAXX (dual Rage128) days. Admittedly, those were a complete pain. The 3.x Cats are easy to install and throw me no errors, beyond bad coding on individual applications. (kRO forgetting to refresh the level-geometry after an alt-tab, for example, assuming nothing else is touching the card buffers)
:(
It's a good card and its most likely hurting due to my computer
P3 866MHz
512MB PC133 RAM
Windows 2000 pro
But I am not happy with it.
In RB6:3RS I am lucky if I can hold more than 20 FPS; it will jump to 60 and then to 10 or lower.
In the 3DMark test I was getting anywhere from the low 20s down to 0 FPS.
In the HW2 demo my computer starts to die on me, but the card isn't far behind; I get mid-30s down to low 10s.
In NS I get anywhere from 80 to 20 FPS.
I think when I get around to dropping 4 grand on my next computer (waiting for Intel to come out with a processor with 64-bit architecture) I am going to go with an ATI card.
I have never been a big ATI fan, but they are starting to impress me a lot.
*Plus, they are from my home town!
Brampton, Ontario Canada!
Interesting note: Gabe Newell said that Valve had to spend five times as much time optimizing the NV3x codepath as the standard DX9 codepath.
Btw, it's pretty funny that one of Gabe's points was "Our customers will be ****."
What would possibly cause a person to just blatantly badmouth a company simply because they've had more experience with another?
Having loyalty to Nvidia was good...a few years ago. Things change. Read the reports, realize the reality of the current situation, and get over it. The situation could just as easily be reversed a few months from now, and any brand loyalty some might have to ATI would be equally retarded.