Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
edited September 2003
The identity thing stems more from the fact that people have a finite amount of money, and don't want to admit to themselves, 'I was an idiot for buying Z'. Which, in truth, they weren't. It was decent at the time, and shared a name with (or WAS) the top-grade card of the time.
It's not brand-loyalty... cards are cards. People just don't want to feel stupid, and so start to rabidly defend 'their' product, which is by extension a direct result of their own decision. The difficulty comes in when ten years down the line, they STILL don't want to feel like an idiot (admit that they were wrong) when their card is in the gutter. This is why we still have folks screaming about how 3Dfx should rule the world.
Get over it. For the moment, ATI has the faster, higher image-quality card. Their drivers are coming up to snuff, while nVidia's are getting loaded down with cheat attempts... which is one reason the .50 Detonators were NOT used in benchmarking. They catch an attempt to take a screenshot and re-render it at full visual quality. So you get a beautiful screenshot... but you don't actually get to PLAY at that beautiful setting. Hmmmm... who posts screenshots along with benchmarking information? That's right! Reviewers, whose numbers and screenshots a lot of people base their purchasing decisions on.
Just another disappointing market-share grab from nVidia, which (thankfully) a number of reviewers caught and gave the finger.
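For what it's worth, here is a purely hypothetical sketch in C of the kind of behaviour being alleged above. This is NOT actual driver code; the function names (render_frame, screenshot_requested) are made up for illustration only. The idea is simply: render at reduced quality for normal play and benchmarks, but re-render at full quality whenever a screenshot is captured.
[code]
/* Hypothetical illustration only -- not real driver code. It sketches the
 * alleged behaviour: reduced-quality rendering normally, full quality when
 * a screenshot (framebuffer read-back) is detected. */
#include <stdio.h>
#include <stdbool.h>

typedef enum { QUALITY_REDUCED, QUALITY_FULL } quality_t;

/* Stand-in for the real rendering work. */
static void render_frame(int frame, quality_t q)
{
    printf("frame %d rendered at %s quality\n",
           frame, q == QUALITY_FULL ? "full" : "reduced");
}

/* Stand-in for detecting a screenshot / framebuffer read-back. */
static bool screenshot_requested(int frame)
{
    return frame == 3;  /* pretend the user grabs a screenshot on frame 3 */
}

int main(void)
{
    for (int frame = 0; frame < 5; frame++) {
        if (screenshot_requested(frame)) {
            /* The saved image gets re-rendered with everything turned up... */
            render_frame(frame, QUALITY_FULL);
        } else {
            /* ...while the frames you actually play (and benchmark) are not. */
            render_frame(frame, QUALITY_REDUCED);
        }
    }
    return 0;
}
[/code]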
I was looking for this and finally found it, so read <a href='http://games.slashdot.org/article.pl?sid=03/09/13/1923213&mode=thread&tid=127&tid=137&tid=152&tid=185&tid=186&tid=204' target='_blank'>this</a> and the links within it before making your decision.
I suppose that counted as an opinion, but if what they said is true, and better drivers were available and not used, the benchmarks in this thread aren't correct. Furthermore, and I'm not saying it's true, but if the "prequel" to HL is now going to be bundled with ATI cards, then we would have another justification for those ridiculous scores.
Funny, my 5900 tested buttloads faster than my 9800 ???
You have both? O_O Oh, and the price cuts I think are close. My guesstimate is that the nVidia should be at £300 or less by November, and maybe close to £200 by Christmas. The ATi should follow, but more slowly (higher demand, I think?). Come November anyway(??), a lot of these problems with either card on "current" games will be sorted, when nVidia and ATi supposedly release their next cards (FX 6000 / Radeon 9900?).
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
QUOTE (GreyPaws @ Sep 15 2003, 03:45 PM): "I suppose that counted as an opinion, but if what they said is true, and better drivers were available and not used, the benchmarks in this thread aren't correct. Furthermore, and I'm not saying it's true, but if the 'prequel' to HL is now going to be bundled with ATI cards, then we would have another justification for those ridiculous scores."
Actually, the reason they were not used was shown in other 'real-world' games on Gamersdepot. The Detonator .50 drivers force visual options down (severe blurring is quite visible) to synthetically raise the framerate. That is why the 'better' drivers were not used; the reviewers went with the ones that will not crank everything back against the wishes of the user.
The Det .50 drivers (as well as the nVidia backend in DOOM 3) lower calculation precision to 12 or 16 bits, rather than the standard 24 bits that ATI cards always use. That is the ONLY reason the nVidia cards do better in DOOM 3. With the standard rendering path (full 24-bit precision), the ATI cards still blow them out of the water, even with all of id's help.
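To make the precision point concrete, here is a small, self-contained C sketch (my own illustration, not code from either vendor) that rounds intermediate values to roughly 16 and 10 mantissa bits, as crude stand-ins for 24-bit and 16-bit floating point, and runs the same accumulation at each precision:
[code]
/* Illustration of precision loss: quantize values to a given number of
 * mantissa bits and compare a long accumulation at each precision.
 * Stored mantissa bits: FP16 has 10, FP24 (ATI's shader format) has 16. */
#include <stdio.h>
#include <math.h>

/* Round x to roughly 'bits' mantissa bits (a crude model, good enough here). */
static double quantize(double x, int bits)
{
    if (x == 0.0) return 0.0;
    int exp;
    double m = frexp(x, &exp);          /* x = m * 2^exp, with 0.5 <= m < 1 */
    double scale = ldexp(1.0, bits);    /* 2^bits */
    m = floor(m * scale + 0.5) / scale; /* keep only 'bits' fractional bits */
    return ldexp(m, exp);
}

int main(void)
{
    /* Toy shader-style loop: accumulate many small lighting contributions. */
    const double term = 0.001;
    double full = 0.0, fp24 = 0.0, fp16 = 0.0;

    for (int i = 0; i < 4096; i++) {
        full += term;                                     /* reference      */
        fp24 = quantize(fp24 + quantize(term, 16), 16);   /* ~24-bit float  */
        fp16 = quantize(fp16 + quantize(term, 10), 10);   /* ~16-bit float  */
    }

    printf("reference    : %f\n", full);
    printf("~24-bit float: %f\n", fp24);
    printf("~16-bit float: %f\n", fp16);
    return 0;
}
[/code]
On a run of this sketch, the ~24-bit accumulator stays essentially on top of the reference value, while the ~16-bit one stalls well short of it once the running total's rounding step exceeds the per-sample contribution; that is the sort of error that shows up on screen as banding and similar artifacts when precision is dropped.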
So at this point it's more a question of whether you want a card that will both play the game well *and* look clear and crisp while doing it. For that, you'll need a Radeon.
If you want good AA quality with a lot less of a performance hit, you'll need a Radeon.
If you want a card that will properly do trilinear filtering while doing AA, you'll need a Radeon. (Yes, another 'optimization' of the nVidia cards is to turn OFF trilinear filtering when AA is enabled... and they STILL run incredibly slowly. nVidia marketroids call it a 'driver bug'.)
If you want a card with native Pixel Shader 2.0 support, you'll need a Radeon. (GFFXes only have PS 1.4, and EMULATED at that.)
If you want a card that stays nicely in its single allocated slot, you'll need a Radeon. :p
<b>THE QUESTION IS DOES IT MATTER!!!!????!!!!!</b>
Seriously, people, this is beginning to get as tired and as old as the Steam debate. If I see one more thread about OMG NVIDIA SUCKS, OMG ATI ARE TEH SUX0R, or NVIDIA CHEATS!!!! or whatever, I may seriously break something, or radically reduce the amount of hair I have on my head.
If you want to discuss cards, do so somewhere else, because some of us are sick of it. People should buy the card <b>they want</b>, not what someone else tells them to buy. Seriously, let's just let people make up their own minds.
-EDIT- OK, let's disable the smilies for this post -EDIT-
QUOTE (Mercior @ Sep 11 2003, 03:46 PM): "You have to remember that the ATI drivers are far, FAR <b>FAR</b> worse than the GeForce ones, and have been optimised for benchmarking tests to try to trick people into buying their card. When you actually come to play a game with a Radeon, you'll find it doesn't perform even close to the GeForce series. I had a volt-modded Radeon 9700 Pro not long ago and it got about 40fps in Half-Life, and 15fps in Battlefield if I was lucky. My current GeForce 4 is >100fps in both those games."
Lol...
It seems that you have forgotten that nVidia was the first corporation to get caught red-handed cheating benchmark software with their drivers :p
I have used a "decent" nVidia card, a GeForce 4 Ti 4200. In Vietcong, my fps was something like 20, and Half-Life gave me something like 70. My 3DMark score was 7200, which was incredibly low considering that I had an Athlon XP 2400+ processor, a good nVidia nForce2 motherboard, and 512 MB of DDR. Not that 3DMark is such a good way to measure performance, but anyway. And I never used anti-aliasing because it caused insanely low framerates.
A few months back, I bought an ATi Radeon 9800 (not Pro). My 3DMark jumped up to 17,000, and with 8x anti-aliasing and 16x anisotropic filtering I get a pretty steady 99 fps in Vietcong, and in Half-Life as well (except for Natural Selection, where the non-Steam version seemed to give me 50-70 fps, which is ODD; the Steam version gives 80-99, though :) ). The Radeon drivers are great and easy to use.
I'm a huge Radeon fan; I can't even think about going back to the "enemy camp" after all the horror I've gone through because of their horrible graphics cards.
I'm on an ATi Rage Fury Pro (got it in 1999), and I get 20.0 fps in NS usually, 2.3-2.7 in a turret farm or when 5 HA show up. When I first got Deus Ex it ran smoothly but had lots of graphical errors; my friend, who had an nVidia card, also got Deus Ex and there wasn't a single graphical error, and the updates they released let him get even higher fps :angry:
*pets his video card*
*slaps* merc.