GeForce FX 5900NU vs. ATI 9800 Pro 128MB

elitebearelitebear Join Date: 2002-05-29 Member: 696Members
edited March 2004 in Off-Topic
<div class="IPBDescription">which to purchase?</div> Hey guys, I've recently been planning on buying a new video card to replace my current one, a horrific GeForce4 MX420. Right now I'm facing a dilemma: with a $200-$250 budget, I'm debating whether to buy a GeForce FX 5900 non-ultra 128MB or an ATI 9800 Pro 128MB (both of which are in about the same price range). I've been reading very mixed reviews on which dominates the other and I'm getting so confused :confused: . Anyone have an idea which one I should go for, and why?

Comments

  • Cereal_KillRCereal_KillR Join Date: 2002-10-31 Member: 1837Members
    I'd go with the 9800 personally; it has been at the top much more. Though I don't know what the "nu" is, the 9800 Pro has been superior to the 5900s by quite a bit.
  • MoquiaoMoquiao Join Date: 2003-05-09 Member: 16168Members
    QUOTE (Cereal_KillR @ Mar 14 2004, 08:19 PM): I'd go with the 9800 personally, it has been at the top much more, though I don't know what the "nu" is, the 9800 pro has been superior to the 5900's by quite a bit.
    nu = non-ultra... I think
  • SpetsnazSpetsnaz Join Date: 2003-12-26 Member: 24761Members, Constellation
    Scrap them both and get a 9600XT.
  • ANeMANeM Join Date: 2003-05-13 Member: 16267Members, Constellation
    Bah... the 9600XT is not better than the 9800 Pro...
  • DOOManiacDOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462Members, NS1 Playtester
    As much as it hurts me to say this, I can't really recommend getting an nVidia card. Also, check out Best Buy this week: they've got a 9800 Pro for $200 after a $50 instant rebate and a $50 mail-in rebate.
  • elitebearelitebear Join Date: 2002-05-29 Member: 696Members
    Hmmm, what I don't get is this: the GeForce FX 5900NU has core/memory clock speeds of 400/850, while the 9800 has core/memory clock speeds of 380/680. I have absolutely no idea what this means, but all I know is that the GeForce FX has more of both. Despite this, the ATI still seems to be favored by many people. What can explain the belief that ATI performs better?
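    For reference, here's a rough sketch of how those memory clock numbers turn into raw bandwidth. The 256-bit bus width is my assumption for both cards (double-check it), and raw bandwidth is only part of the picture:

    ```python
    # Memory bandwidth (GB/s) = effective memory clock (MHz) x bus width (bits) / 8 / 1000.
    # The 256-bit bus width is an assumption for both cards here; verify before relying on it.
    def mem_bandwidth_gbs(effective_mhz, bus_bits=256):
        """Theoretical peak memory bandwidth in GB/s."""
        return effective_mhz * bus_bits / 8 / 1000

    fx5900 = mem_bandwidth_gbs(850)  # ~27.2 GB/s
    r9800 = mem_bandwidth_gbs(680)   # ~21.76 GB/s

    # Raw bandwidth favors the FX 5900 on paper, but usable bandwidth depends
    # on compression and how efficiently the chip spends what it has.
    assert fx5900 > r9800
    ```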
  • TortexTortex Join Date: 2004-03-02 Member: 27063Members
    Don't get nVidia cards! They have a virus on their site right now when you go to update your drivers for your graphics card! I got a trojan right now that I'm still looking at to see how to get it out.
  • OttoDestructOttoDestruct Join Date: 2002-11-08 Member: 7790Members
    QUOTE (elitebear @ Mar 14 2004, 03:09 PM): hmmm. what i dont get is this. the geforce fx5900nu has a core clock/memory clock speed of 400/850, while the 9800 has a core clock/memory clock speed of 380/680. i have absolutely no idea what this means, but all i know is that the geforce fx has more of both. despite this fact, the ati still seems to be favored by many people. what can explain the belief that ati performs better
    If you don't know what it means, then don't buy a card based on someone saying it overclocks well, as I doubt you would want to do that. I would personally go with the 9800 Pro; I've heard lots of horror stories about the 5900s.
  • elitebearelitebear Join Date: 2002-05-29 Member: 696Members
    edited March 2004
    QUOTE (OttoDestruct @ Mar 14 2004, 08:12 PM): If you dont know what it means, then don't buy a card based on someone saying it overclocks well, as I doubt you would want to do that. I would personally go with the 9800 Pro, I've heard lots of horror stories on the 5900's.
    Well, regardless of whether I plan on overclocking my video card or not, the GeForce FX 5900 still has higher clock speeds, which would imply that it performs better and that I should purchase it. Sigh :confused: , I'm so confuzzled.
  • SvenpaSvenpa Wait, what? Join Date: 2004-01-03 Member: 25012Members, Constellation
    I have about the same dilemma, but it's Radeon 9800 XT vs. Leadtek GeForce FX 5950 Ultra. As far as I can see from the memory and engine (core?) clocks, the GeForce is faster. But which would you recommend? I don't know about the other stuff it says about them.
  • CForresterCForrester P0rk(h0p Join Date: 2002-10-05 Member: 1439Members, Constellation
    edited March 2004
    I would go with the 9800. The difference in clock speeds is only minor, and I do believe that the 9800 beats the 5900. ATi's cards deliver much better graphics quality and work much better with antialiasing and anisotropic filtering. You don't get a super-high framerate compared to nVidia's cards (the difference is minimal), but you do get a really stable framerate without sacrificing picture quality. Plus, ATi's drivers rock, especially with the <a href='http://www.omegacorner.com' target='_blank'>Omega Drivers</a>, which are modded versions of the current Catalyst (ATi's) drivers that contain the author's custom configs and utilities.
    <span style='font-size:8pt;line-height:100%'>Minor edit by Marik_Steele to make URL work</span>
  • DOOManiacDOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462Members, NS1 Playtester
    Clockspeed means nothing; in the end it's the actual performance of the card that matters, and sadly ATI is currently in the lead, and by a lot. :(
  • DarkDudeDarkDude Join Date: 2003-08-06 Member: 19088Members
    I would get the Powercolor Radeon 9800SE. You may think I'm crazy, but stick with me here.

    There are two types of 9800SE cards: one based on the 9500 chipset, and one based on the 9700 chipset. Obviously, the 9700-based 9800SE is better. Why? It has 8 pixel pipelines (256-bit) with 4 locked ones, while the 9500-based card has only 4 (128-bit), period. With some softmod drivers (just Google "softmod") you can unlock those 4 extra pipelines and have a 9800 Pro for $50-$100 less than you would have had to pay.
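    If the unlock works, the arithmetic is straightforward. A rough sketch (the 380 MHz core clock is an illustrative assumption, not a quoted spec, and real performance depends on more than pipe count):

    ```python
    # Rough peak pixel fillrate = pixel pipelines x core clock (MHz),
    # in millions of pixels per second. The 380 MHz core clock is an
    # illustrative assumption, not an official spec for this exact card.
    def peak_fillrate_mpixels(pipelines, core_mhz):
        return pipelines * core_mhz

    as_shipped = peak_fillrate_mpixels(4, 380)  # 9800SE with 4 pipes locked
    softmodded = peak_fillrate_mpixels(8, 380)  # all 8 pipes unlocked

    # A successful softmod doubles the theoretical fillrate.
    assert softmodded == 2 * as_shipped
    ```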

    You have to make sure you've got the right card based on a 9700, though. <a href='http://www.newegg.com/app/ViewProductDesc.asp?description=14-131-236&depa=1' target='_blank'>Here</a> is a 9700-based 9800SE with 8 pixel pipelines. You can tell it has all 8 because the 4 memory chips are on either side of the GPU.

    If you've got the money for a 9800 Pro, by all means get it. If you're looking for the best card at the best price, you might want to check this one out. It's not without risk: you might end up getting a card where the other 4 pipelines aren't just disabled but actually broken. Even then, you could always send it back and grab another card.

    I've got another video-card-related question also. If you have a card with external power, does it plug directly into the PSU, or do you plug it into the wires coming out of the PSU? I've got a few open slots on the wires coming out of my PSU, and I'm just wondering if I have to rip open the box surrounding the PSU in order to install my new video card.
  • AlignAlign Remain Calm Join Date: 2002-11-02 Member: 5216Forum Moderators, Constellation
    Why do you want a new card when you already have a geforce 4? I have a geforce 3 and it is quite enough, even for ut2k4...
  • JavertJavert Join Date: 2003-04-30 Member: 15954Members
    Go for whichever is cheaper and gives you the best deal. Unless you plan to run Doom5 on the highest of high settings, you won't see a difference.
  • ElderwyrmElderwyrm Join Date: 2003-04-07 Member: 15296Members
    QUOTE (DOOManiac @ Mar 14 2004, 04:10 PM): Clockspeed means nothing, in the end its the actual performance of the card that matters, and sadly ATI is currently in the lead, but like, a lot. :(
    Sadly?

    Go with the ATI. They are much better than Nvidia. Also the Nvidia card will take up one of your PCI slots because the card is so huge.
  • SvenpaSvenpa Wait, what? Join Date: 2004-01-03 Member: 25012Members, Constellation
    What's best, the 9800 Pro or XT? Does the 9800 (XT, Pro, whatever's best) beat the GeForce 5950 Ultra, and does it matter which company made it (Leadtek, Gainward, etc.)?
  • DOOManiacDOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462Members, NS1 Playtester
    QUOTE (Align @ Mar 14 2004, 03:45 PM): Why do you want a new card when you already have a geforce 4? I have a geforce 3 and it is quite enough, even for ut2k4...
    Games out today are pushing the GF4 past its limits if you want the absolute highest quality graphics. Deus Ex 2, Far Cry, Battlefield: Vietnam, Painkiller, and soon Stalker, DOOM 3, and HL2 are all very hardware-taxing if you want them to look drop-dead gorgeous. Which I do. Which is why I'm looking to replace my obsolete, aging Ti4400... :P
  • Umbraed_MonkeyUmbraed_Monkey Join Date: 2002-11-25 Member: 9922Members
    QUOTE (DOOManiac @ Mar 14 2004, 05:10 PM, quoting Align): Games out today are pushing the GF4 over its limits if you want the absolute highest quality graphics. Dues Ex 2, Far Cry, Battlefield: Vietnam, PainKiller, and soon Stalker, DOOM 3, and HL2, are all very hardware taxing if you want it to look dropdead gorgeous. Which I do. Which is why I'm looking to replace my obsolete, aging Ti4400... :P
    I'll give that useless POS a home if you want... ;)
  • kuperayekuperaye Join Date: 2003-03-14 Member: 14519Members, Constellation
    Hey, if you're going to overclock your gfx card, I would DEFINITELY get a 5900 NU, because I'm pretty sure you can flash it to a 5950 Ultra, and that's a $500 card. To double-check, though, go to www.guru3d.com.

    But if not, get the 9800 Pro; it's a lot better.
  • elitebearelitebear Join Date: 2002-05-29 Member: 696Members
    Dude, it's a GeForce4 MX420, 32MB. It's worth about 40 dollars at Best Buy.
  • OttoDestructOttoDestruct Join Date: 2002-11-08 Member: 7790Members
    QUOTE (elitebear @ Mar 14 2004, 03:21 PM, quoting OttoDestruct): well regardless of if i plan on overclocking my video card or not, the geforce fx5900 still has higher clocking speeds which would imply that it performs better and that i should purchase it for being so. sigh, im so confuzzled.
    If anything, higher clock speeds could mean worse: higher speed = higher temperature = bad. Just because it isn't clocked as high doesn't mean the 9800 Pro doesn't score as well; go to <a href='http://www.tomshardware.com' target='_blank'>Tom's Hardware</a> and search around for benchmarks. And all these people telling you to buy cards based on softmodding, ignore them. The chance that you actually get a card that CAN be softmodded, along with the chance that you don't kill it (and even attempting a softmod will probably void the warranty), is extremely low. Even if the cards DO overclock, they definitely weren't designed to run with 8 pixel pipelines at higher clock speeds, thus frying them that much quicker.
  • Soylent_greenSoylent_green Join Date: 2002-12-20 Member: 11220Members, Reinforced - Shadow
    QUOTE: Why do you want a new card when you already have a geforce 4? I have a geforce 3 and it is quite enough, even for ut2k4...

    The GeForce4 MX420 is a GeForce 4 in name only. It's basically a GeForce 2.

    Some people like anti-aliasing and anisotropic filtering. The GF3 will only run very simple pixel shaders (PS 1.1 only, I think) and it is quite slow, which will suck in most coming games, like HL2, Doom 3, Far Cry, and Stalker.
  • Soylent_greenSoylent_green Join Date: 2002-12-20 Member: 11220Members, Reinforced - Shadow
    QUOTE: well regardless of if i plan on overclocking my video card or not, the geforce fx5900 still has higher clocking speeds which would imply that it performs better and that i should purchase it for being so. sigh, im so confuzzled.

    Clock speed means very little when you don't know how much gets done in one clock. A simple card like a GeForce4 MX440 can run at quite high frequencies, like 275 MHz, and get much, much less done in one clock than a normal GeForce 4 at the same clock frequency.

    The memory clock is not all that relevant either; compression and bandwidth-saving techniques are usually used, and it is important to know how good the card is at using its bandwidth before you can make any judgements at all on which has more usable bandwidth.

    nVidia cards (this current generation; things may well change with the NV40) are quite bad at running pixel shaders, the tiny programs run for each pixel on the graphics card. These can do some very beautiful things, such as water that refracts light realistically, and will be more and more important in future games. Nvidia uses lower-precision shaders in many games because they cannot run them fast enough otherwise (this may have an impact on image quality, though not always).

    Right now ATi has nicer antialiasing and worse anisotropic filtering (angle-dependent: weird angles get less attention, to save transistors).

    Nvidia's strong point right now is raw fillrate in older games and OpenGL games (they always had a lovely OpenGL driver).
  • Jim_has_SkillzJim_has_Skillz Join Date: 2003-01-19 Member: 12475Members, Constellation
    In all of my experience, nvidia has been the best video card. ATI does not greatly outperform Nvidia in any of the tests. Some of the tests are even biased. Last I remember, nvidia scored better in a Futuremark test but ATI whined that they were supposedly cheating, which they weren't (why would you want to draw graphics that aren't even on the screen?). And so they corrected the tests so Nvidia would not have the supposed 'advantage'. Nvidia also has the best drivers for their cards I have ever seen. The ATI drivers are the worst, and they aren't even getting better. I would definitely go with Nvidia; ATI is way too hyped up right now. The only thing ATI does that I like is compete with Nvidia so prices stay low.
  • OttoDestructOttoDestruct Join Date: 2002-11-08 Member: 7790Members
    QUOTE (Jim has Skillz @ Mar 14 2004, 09:59 PM): Nvidia also has the best drivers for their cards I have ever seen. The ATI drivers are the worst and they aren't even getting better.
    Please define best and worst. Seriously, you don't back up a single claim in there. ATI and nVidia both have good drivers; it's a matter of which card outperforms the other.
  • Soylent_greenSoylent_green Join Date: 2002-12-20 Member: 11220Members, Reinforced - Shadow
    edited March 2004
    QUOTE: In all of my experiences, nvidia has been the best video card. ATI does not greatly outperform Nvidia in any of the tests. Some of the tests are even biased. Last I remember nvidia scored better in a futuremark test but ATI whined that they were supposedly cheating, which they weren't(why would you wanna draw graphics that aren't even on the screen). And therefore they corrected the tests so Nvidia would not have the supposed 'advantage'. Nvidia also has the best drivers for their cards I have ever seen. The ATI drivers are the worst and they aren't even getting better. I would definately go with Nvidia, ATI is way too hyped up right now. The only thing ATI does that I like is compete with Nvidia so prices are low.

    If you use the developer version of 3DMark you can "go off the rails" and look at the surroundings, and what do you know: HOM (something isn't rendered). Adding static clip planes is not an optimization you can do in a game, and its only effect is to artificially inflate nV's score by throwing out things before you could possibly know whether they are to be rendered or not. A valid optimization in a benchmark is one that you can do in a game; no precomputing is allowed. 3DMark has a few driver versions that they accept for nV, and even those are STILL cheating by Futuremark's own admission (lower shader precision), but at least such optimizations can be used in a game. Degrading image quality for the sake of performance in a benchmark is quite a low thing to do.

    ATi did NOT whine; <a href='http://www.extremetech.com/' target='_blank'>ExtremeTech</a> did.

    What's the use of a benchmark if one IHV does every cheat and optimization they think they can get away with and one does not? (ATi did some PS instruction reordering which gained them a percent or two; they admitted it and removed it, even though it is a valid optimization that is mathematically equivalent to the original shader and one you might find in a game.) Nvidia lied and said their cheats were some weird driver bugs (you don't accidentally add static clip planes just outside view at all times). Nvidia gained roughly 30% from their "optimizations".

    Also, look at nvidia in ShaderMark or the HL2 preview and you'll see that full-precision pixel shaders significantly hurt performance on nV hardware.

    Nvidia doesn't even appear to use full trilinear filtering in ANY DX game. Instead it forces a pseudo-trilinear mode (affectionately referred to as "brilinear" filtering) which cannot be overridden by the user. I certainly wouldn't mind having it as an option, as it does not significantly reduce image quality, but forcing it is a bit iffy.

    John Carmack has said that he thinks developers will treat nv3x cards as DX8 cards, and he appears to be correct (he himself is treating it as a DX8 card; so are Halo, HL2, Tomb Raider: AOD (urgh, horrid game), etc.). ATi performance in Doom 3 was twice as good as nV's when full precision was used; nV had better performance than ATi when mainly FX12 (12-bit integer) shaders were used. The lack of precision won't matter much to Doom 3 according to Carmack, but it may well matter to other games. It certainly will if you use high-dynamic-range rendering; integer shaders are right out there as well.

    Gabe Newell has also said a fair few things about nV performance in HL2; they allegedly spent 6 times as much time on nV hardware-specific optimizations as they did for ATi, because mid-end nv3x hardware had such horrid performance.

    Personally, I wouldn't count on nV putting "optimizations" in their drivers, or on developers creating a lower-precision path, for every game using full-precision PS2 that I might want to play in the future.

    Checking reviews recently, it would appear that nV's drivers cause more trouble than ATi's as well.
  • WarriorWarrior Join Date: 2003-02-16 Member: 13624Members
    Go with the ATI 9800 Pro. Also, you'd better have the CPU and RAM to use the new card, or you won't get much of a performance gain.
  • xioutlawixxioutlawix Join Date: 2002-11-05 Member: 7118Members, Constellation
    edited March 2004
    QUOTE (Jim has Skillz @ Mar 14 2004, 09:59 PM): Speculation

    QUOTE (Soylent green): Facts

    Owned
  • TalesinTalesin Our own little well of hate Join Date: 2002-11-08 Member: 7710NS1 Playtester, Forum Moderators
    In short, though the nVidia card is clocked higher, it's nowhere near as *efficient* as the ATI card. That's the difference... nVidia *has* to crank up the clocks to even make a pass at being competitive.

    Their code paths are aging; they haven't redesigned their actual core in a very long while (as in a complete overhaul, instead of just slapping on new modules, which is more or less what they've been doing).
    As an example, the GFFX may be clocked at 800MHz... but if it takes four cycles to finish an operation, whereas the ATI card can do four of those operations in ONE cycle, an ATI card at 200MHz would be four times FASTER than the 800MHz 'beast'.
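    That trade-off can be written out directly. A sketch using the hypothetical numbers from the example above (not real specs for either card):

    ```python
    # Effective rate (millions of ops/sec) = clock (MHz) / cycles needed per op.
    # Both cards below use the hypothetical figures from the example, not real specs.
    def mops(clock_mhz, cycles_per_op):
        return clock_mhz / cycles_per_op

    high_clock_card = mops(800, 4)    # 800 MHz, but 4 cycles per operation
    low_clock_card = mops(200, 0.25)  # 200 MHz, 4 operations per cycle

    # The 200 MHz card comes out four times faster despite the lower clock.
    assert low_clock_card == 4 * high_clock_card
    ```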

    It's spec-sheet BSing to get non-savvy customers to buy their inferior product.


    Another funny thing: no nVidia card can ACTUALLY do Pixel Shader 2.0 effects. The card translates any PS2.0 instruction it's given back to PS1.4 and THEN renders it. That translation only slows it down further.

    nVidia cards are slower. They also have WORSE image quality, especially in FSAA and anisotropic tests. They're following the path of 3dfx with their 'Cg' language. In short, they need to shape up and stop relying on rabid fanboys to buy their crappy cards, or they'll be washed up pretty quickly. Deal with it.


    Now, on to which card to buy. The 9800 Pro is an excellent card, and much faster than the 9600XT. If you can get your hands on one for only $200, DO SO. Make sure it's a genuine Built By ATI, and not third-party... I've never had a problem with an official card, so that's all I'll buy. I've had friends who went for the off-brand, and they've had... problems.


    And as to what someone said before: the 9800 non-Pro is NOT a four-pixel-pipeline card. The 9600 (non-Pro, Pro, and XT) is a 4-pipeline card. The 9500 non-Pro was a four-pipe card, but the 9500 Pro (which I own) is an 8-pipe card, just like the 9700... just clocked a little lower.