Nvidia or ATI?

24 Comments

  • 7Bistromath Join Date: 2003-12-04 Member: 23928 Members, Constellation
    I'd just like to say that I believe that this website I made (http://toaster.ytmnd.com/) a while back is the definitive source of info on SLI technology.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    ! You stole my graphic :O
  • 7Bistromath Join Date: 2003-12-04 Member: 23928 Members, Constellation
    QUOTE (DOOManiac @ Jul 19 2004, 02:21 AM): ! You stole my graphic :O
    You mean you didn't see that link the day I made it?
  • redeemed_darkness Join Date: 2003-01-21 Member: 12565 Members
    edited July 2004
    As of right now, ATI is the way to go for gaming. *edit* Just make sure your comp can support it (most comps will).


    In the future we will see what happens with SLI, and then we can judge whether it is worthwhile or a load of crap.
  • spinviper Join Date: 2003-05-08 Member: 16151 Members
    QUOTE (Defiance @ Jul 19 2004, 01:28 AM): After tons of garbage driver problems and incompatibility with certain VIA chipsets, I will never buy another ATI card for as long as I live. Radeon 9600 XTs do not work with VIA KT600 chipsets, regardless of the motherboard model or manufacturer. I'm not saying their cards are bad in terms of hardware and how well they can run games; I'm saying their cards are bad in terms of garbage drivers and compatibility issues. The most you can get out of that card in that situation is AGP 4x mode - half of what it can really do.
    Are you sure? I have another computer that uses the VIA KT600 chipset and it works perfectly - no driver problems.
  • Beast Armonkyi Join Date: 2003-04-21 Member: 15731 Members, Constellation
    edited July 2004
    Here's another question: which of the 6800/X800 will remain a good card for the longest?
    I bought a GeForce 4 Ti4600 two years ago. It's still kinda good, but it's only now beginning to show signs that an upgrade is possibly needed.

    I don't buy graphics cards often; I don't have the funds to buy the newest one every 6 months. So I need to know which of the 6800/X800 will stay a good card for the longest :P.

    From the many reviews I've seen, the 6800 and X800 look to be near identical in performance, but drivers can change a lot of things, so it's even more confusing.
  • Quaunaut The longest seven days in history... Join Date: 2003-03-21 Member: 14759 Members, Constellation, Reinforced - Shadow
    QUOTE (Beast @ Jul 19 2004, 05:51 AM): Here's another question: which of the 6800/X800 will remain a good card for the longest? I bought a GeForce 4 Ti4600 two years ago. It's still kinda good, but it's only now beginning to show signs that an upgrade is possibly needed. I don't buy graphics cards often; I don't have the funds to buy the newest one every 6 months. So I need to know which of the 6800/X800 will stay a good card for the longest :P. From the many reviews I've seen, the 6800 and X800 look to be near identical in performance, but drivers can change a lot of things, so it's even more confusing.
    It is.

    The problem is, GeForce is only good at OpenGL, and even there it doesn't do too great.

    Then you look at ATi: Direct3D, and they do OpenGL pretty well too, just not as well as GeForce.

    Now you'd say they're equal, but the thing is, every new game coming out except DOOM 3 uses DX9.

    So ATi wins. And even in games where OpenGL is the engine's API, ATi wins once in a while - like with Half-Life. Now, I know there is a problem right now where ATi doesn't do too well on the HL engine, but from what I've seen that's totally random, and simply switching to older drivers makes it fine again. And even so, ATi has some pretty cool things to warp the screen, which is pretty fun :)

    Note again: I HAVE A GEFORCE FX 5600 256MB.

    So I'm COMING from GeForce, and have learned the error of my ways.
  • Isamil Join Date: 2003-11-25 Member: 23552 Members, Constellation
    QUOTE (Defiance @ Jul 19 2004, 01:28 AM): After tons of garbage driver problems and incompatibility with certain VIA chipsets, I will never buy another ATI card for as long as I live. Radeon 9600 XTs do not work with VIA KT600 chipsets, regardless of the motherboard model or manufacturer. I'm not saying their cards are bad in terms of hardware and how well they can run games; I'm saying their cards are bad in terms of garbage drivers and compatibility issues. The most you can get out of that card in that situation is AGP 4x mode - half of what it can really do. All I know is I've never had problems with any Nvidia cards I own. And after looking at a lot of benchmarks, since I have to get a new video card now: most of the higher-end ones, like others have said, are alike in many ways, but each card has its own advantages. Personal preference... buy a card for what you want it to do, and make sure to research it thoroughly to make sure it will work with all your current hardware.
    That's a motherboard driver problem. You need new motherboard drivers.

    Here you go: http://downloads.viaarena.com/drivers/4in1/VIA_Hyperion%204IN1_V451v.zip
    4-in-1 drivers. I had the same problem, and they fixed it.
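    For anyone wondering why being stuck in 4x mode really is "half of what it can really do": AGP bandwidth scales linearly with the transfer mode. A minimal sketch of that arithmetic (the ~266 MB/s base rate is the standard AGP 1x figure; the rest is illustration):

    ```python
    # Back-of-envelope AGP bandwidth, showing why 4x mode is half of 8x.
    # Assumption: the published AGP base rate of ~266 MB/s per 1x, scaling linearly.
    AGP_BASE_MB_S = 266  # AGP 1x: 66 MHz x 32-bit bus ~ 266 MB/s

    def agp_bandwidth_mb_s(mode: int) -> int:
        """Approximate peak bandwidth in MB/s for an AGP transfer mode (1, 2, 4, 8)."""
        return AGP_BASE_MB_S * mode

    print(agp_bandwidth_mb_s(4))  # ~1064 MB/s: stuck in 4x mode
    print(agp_bandwidth_mb_s(8))  # ~2128 MB/s: what the card can really do
    ```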
  • Svenpa Wait, what? Join Date: 2004-01-03 Member: 25012 Members, Constellation
    How will it be when Nvidia releases cards that can work together? I absolutely think two 6800s beat one X800 XT.
  • TommyVercetti Join Date: 2003-02-10 Member: 13390 Members, Constellation, Reinforced - Shadow
    edited July 2004
    Like I said, I'm definitely going for the X800.

    These are the complete specs of the system I'm buying, most options ripped from the site:

    Cyberpower Inc. AMD System, Custom Equipped by Me
    Case: Aluminum X-SuperAlien Server 500W w/ Window & LCD Temperature Display + Fan Control (Black Color)
    Power Supply Upgrade: Standard Case Supply
    Neon Light Upgrade: 12" Cold Cathode Neon Light Sound Activated (Red Color)
    Processor: AMD Athlon 64 FX-53 Processor
    Motherboard: Asus SK8V VIA K8T800 Chipset AGP 8x w/ LAN, USB 2.0, IEEE, and Audio
    Memory: 4 x 512MB Dual-Channel Registered ECC PC3200 DDR400 Memory Sticks (Standard Major Brand)
    Hard Drives: 2 Western Digital Raptor 74GB 10,000 RPM Serial ATA 8MB Cache WD740GD (RAID-0 Config)
    Video Card: None; ATi Radeon X800 XT Platinum Edition AGP 8X 256MB (purchased separately)
    Free Game: None; only available in nVidia package
    Free Far Cry DVD Game: Of course
    Primary Optical: 16x DVD-ROM & 52x32x52 CD-RW Combo Drive (Red Color)
    Secondary Optical: 12x DVD-R/RW/CD-RW Drive (Red Color)
    Monitor: NEC AccuSync 120 21" CRT Monitor (purchased separately)
    Sound Card: Mad Dog Entertainer 7.1 DSP (purchased separately)
    Thermal Display: No; it's built in to the case
    Flash Media Reader/Writer: Internal 6-in-1 Flash Media Reader/Writer (Red Color)
    Cooling Fans: AMD Athlon 64 FX Certified CPU Fan & Heatsink + 3 Extra Case Fans
    Speakers: Creative Labs Inspire T7700 7.1 Surround Subwoofer Speaker System (Black Color)
    Modem: PCI 56k v.92 Fax Modem w/ Voice
    Floppy Drive: 1.44MB Floppy Drive (Red Color)
    Zip Drive: No
    Network: Onboard 10/100 Network Card
    Keyboard: Ideazon Z-Keyboard Base Interchangeable Keyboard + CROSS-FIRE Interface
    Mouse: None; Razer Boomer Speed (purchased separately)
    Operating System: None; already in possession of Windows XP Home Edition
    Rounded Cable: Not unless you can think of a reason
    Service: Standard Warranty + 1-Year On-Site Service
    Rush Service: No; ship out in 5-10 business days
    Wireless 802.11B/G Access Point: None
    Wireless 802.11B/G Network Card: None
    Office Suite 1: Microsoft Works 7.0
    Office Suite 2: None
    Printer: None
    Printer Cable: None
    MP3 Player: None
    IEEE Card: None; already an integral component of the motherboard
    USB Ports: Standard 2 USB Ports + 2 Extra USB Ports
    Video Camera: None
    Scanner: None

    Total: $3,439.00

    Anyone spot any weaknesses?
  • MedHead Join Date: 2002-12-19 Member: 11115 Members, Constellation
    My father swears by nVidia. He says, "ATI drivers are pathetic; they don't work. There are minor to no differences between ATI and nVidia cards, so the driver is what is important. And since ATI drivers don't work well with Half-Life, nor does my ATI TV card program work well, I consider nVidia a better card. I've *never* had any problems with nVidia."

    On the other hand, I'm more of an ATI fan - for the moment. I go with whatever card is better. If I remember correctly, I think it was ATI that had the "wallhack" drivers. The company that did those drivers was very flippant about the whole thing... but then again, it was only one game that it wasn't working in (but, on the other hand, Half-Life *is* a pretty big game to ignore).

    But once the huge air horn got attached to the nVidia card, I started looking at ATI more closely. And wow, these things were kicking nVidia away. At 4x FSAA, ATI cards were purring while nVidia was dying in a pool of its own vomit. ATI never made a foghorn heatsink that required two slots, either.

    For the moment, I'm stuck on ATI (and no, not because of Talesin, although the debates back and forth in the past have certainly helped me learn more about ATI).
  • Quaunaut The longest seven days in history... Join Date: 2003-03-21 Member: 14759 Members, Constellation, Reinforced - Shadow
    QUOTE (Svenpa @ Jul 19 2004, 07:00 AM): How will it be when Nvidia releases cards that can work together? I absolutely think two 6800s beat one X800 XT.
    I wouldn't count on it.

    GeForce can't do AF or AA worth CRAP. AT ALL.

    They MAY equal out with two cards, though.
  • Birdy Join Date: 2003-05-29 Member: 16825 Members, Constellation
    Expensive O.o

    I want an ATI too, never had one before ^^
  • dynamicx Join Date: 2004-06-27 Member: 29574 Members
    OK, let's try to get a few things right: the only thing this SLI shares with the 3dfx days is the name. It comes as standard on the 6800 Ultra Extremes. It offers roughly 1.8x the speed of a single card.
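    To put that 1.8x claim in concrete terms, here's a minimal sketch (the scaling factor is the figure from this post; the baseline frame rates are invented for illustration):

    ```python
    # Illustrating the claimed ~1.8x SLI scaling over a single card.
    # The 1.8 factor is the figure quoted above; baseline FPS values are invented.
    SLI_SCALING = 1.8

    def sli_fps(single_card_fps: float, scaling: float = SLI_SCALING) -> float:
        """Effective frame rate with two cards under a simple linear-scaling model."""
        return single_card_fps * scaling

    for baseline in (30, 45, 60):
        print(f"{baseline} fps on one card -> {sli_fps(baseline):.0f} fps in SLI")
    ```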

    As for OpenGL and DirectX: Nvidia kicks ATI's arse in OpenGL. DirectX 9 is a little different. When running at high resolutions with no filtering, Nvidia is a little faster; they are equal with just AA enabled; and ATI just edges ahead with anisotropic filtering enabled.

    PS 3.0 is not a marketing scam. Granted, it won't provide as much of a difference as PS 1.1 -> PS 2.0 did, but all the newest games are going to be using it; Unreal Engine 3.0, for example, will most likely require it. Current games are also starting to take advantage of it: Far Cry, for example, added parts of it in the latest patch, mainly to increase performance for the moment (from 2% up to almost 20%), with the image enhancements of PS 3.0 coming later.

    The image quality wars are dead... the fact that you have to freeze-frame and zoom in 10x to notice any difference kinda speaks for itself. As for "optimizations", both have done their fair share: Nvidia with 3DMark03 and reverting to bilinear when it shouldn't, ATI with quack3 and brilinear.

    Overall: if you don't upgrade that often, no question but to go with Nvidia. If you want to run everything at 1600x1200 with max settings and filtering but don't want to go with Nvidia and SLI, go with ATI. Practically speaking, all the latest cards are cracking, and you'll be happy with whichever company you like best/can find cheapest. :)
  • kuperaye Join Date: 2003-03-14 Member: 14519 Members, Constellation
    QUOTE (Svenpa @ Jul 19 2004, 09:00 AM): How will it be when Nvidia releases cards that can work together? I absolutely think two 6800s beat one X800 XT.
    Of course: spend 1000 dollars to beat the crap out of a card that you paid 400 dollars for, and it still outperforms your cards 1v1.

    Spend 1000 dollars - yeah, have fun with no space in your computer.
  • EEK Join Date: 2004-02-25 Member: 26898 Banned
    edited July 2004
    QUOTE (dynamicx @ Jul 19 2004, 12:14 PM): PS 3.0 is not a marketing scam. Granted, it won't provide as much of a difference as PS 1.1 -> PS 2.0 did, but all the newest games are going to be using it; Unreal Engine 3.0, for example, will most likely require it. Current games are also starting to take advantage of it: Far Cry, for example, added parts of it in the latest patch, mainly to increase performance for the moment (from 2% up to almost 20%), with the image enhancements of PS 3.0 coming later.
    ... that's like buying a pure electric car and saying that everyone will need one in the future. Yeah, that may be true at some point, but 'future' is not 'now'.

    If you're still on your 6800 when Unreal 3 comes around, you'll need to upgrade anyway. By then ATI will have it as well, so it doesn't matter.

    Secondly, who cares if Far Cry is using Pixel Shader 3.0, or even parts of it, for a performance boost? I showed you the benchmarks - ATI SMACKED THE UTTER MOTHER LIVING *CRAP* out of NVIDIA. You're going to say get a 6800 for PS3.0 for a 20% boost... when at top settings the NVIDIA slugged along at 12 FPS, while ATI went a (relatively speaking) blazing-fast 38-40 FPS? So, what, get the nvidia card and maybe go *16* FPS instead of 12? "OMGZ PS 3.0 CHANGED MEH LIFE!!!!" Hahahhahahahahahaahahha...
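    The arithmetic behind that jab is easy to check. A quick sanity-check sketch (all figures are the ones quoted in this thread, not measured benchmarks):

    ```python
    # Sanity-checking the quoted numbers: a 2%-20% PS3.0 boost applied to 12 fps.
    # All inputs come from the posts above; nothing here is a real benchmark.
    nvidia_fps = 12.0             # quoted NVIDIA frame rate at top settings
    ati_fps_range = (38.0, 40.0)  # quoted ATI range at the same settings

    for boost in (0.02, 0.20):
        boosted = nvidia_fps * (1 + boost)
        print(f"{boost:.0%} boost -> {boosted:.1f} fps "
              f"(ATI: {ati_fps_range[0]:.0f}-{ati_fps_range[1]:.0f})")
    # Even the optimistic 20% case lands at 14.4 fps, nowhere near 38-40.
    ```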

    As for whoever way back said 'Blah blah well the GAME DEVELOPERS can start using pixel shader 3.0'... 1) I'm not a game developer. 2) Neither are you. 3) So really - WHO GIVES A ****? Am I gonna praise the company that made AutoCAD? I'm sure architectural engineers will, but it doesn't affect me at all, so I certainly don't.
  • dynamicx Join Date: 2004-06-27 Member: 29574 Members
    QUOTE (EEK @ Jul 19 2004, 12:29 PM): ... that's like buying a pure electric car and saying that everyone will need one in the future. Yeah, that may be true at some point, but 'future' is not 'now'. If you're still on your 6800 when Unreal 3 comes around, you'll need to upgrade anyway. By then ATI will have it as well, so it doesn't matter. Secondly, who cares if Far Cry is using Pixel Shader 3.0, or even parts of it, for a performance boost? I showed you the benchmarks - ATI SMACKED THE UTTER MOTHER LIVING *CRAP* out of NVIDIA. You're going to say get a 6800 for PS3.0 for a 20% boost... when at top settings the NVIDIA slugged along at 12 FPS, while ATI went a (relatively speaking) blazing-fast 38-40 FPS? So, what, get the nvidia card and maybe go *16* FPS instead of 12? "OMGZ PS 3.0 CHANGED MEH LIFE!!!!" Hahahhahahahahahaahahha... As for whoever way back said 'Blah blah well the GAME DEVELOPERS can start using pixel shader 3.0'... 1) I'm not a game developer. 2) Neither are you. 3) So really - WHO GIVES A ****?
    Have a look at this for some better benches: http://www.firingsquad.com/hardware/far_cry_sm30/
    And as for PS 3.0, the point was that it's closer than you think: S.T.A.L.K.E.R., the next Splinter Cell, and many more games that really aren't that far away will use it. Also, if PS 3.0 can be added to Far Cry, even partially, it could be added to both HL2 and Doom 3.
  • Quaunaut The longest seven days in history... Join Date: 2003-03-21 Member: 14759 Members, Constellation, Reinforced - Shadow
    QUOTE (dynamicx @ Jul 19 2004, 10:39 AM), quoting EEK's post above: Have a look at this for some better benches: http://www.firingsquad.com/hardware/far_cry_sm30/ And as for PS 3.0, the point was that it's closer than you think: S.T.A.L.K.E.R., the next Splinter Cell, and many more games that really aren't that far away will use it. Also, if PS 3.0 can be added to Far Cry, even partially, it could be added to both HL2 and Doom 3.
    linkage no worky.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    404NotFound, note that isn't the number of connectors... you may have more than one molex connector per power lead.
    The 6800 demands that you give it a full POWER LEAD for each molex connector on the board, rendering every other molex connector on those particular bundles of wire unusable.

    TommyVercetti, you've run into one of the basic problems. Do NOT use a Creative Labs card on any Ath64 machine. CL has screwed up the drivers, to where they do not catch an 'ACPI race condition'... the short version being, if you're running Windows on it, and go into a 3D game, or use a network card, the machine will randomly blackscreen and reboot with no warning. This is PURELY Creative Labs' fault, in the manner in which they wrote their drivers. The cards before the SBLive! will work fine, if you can still find an old SB16-PCI... but SBLive, Audigy, and Audigy2 (which are all the same damn card, honestly) will cause this behaviour because the Ath64 processors (including the FX line) require ACPI to be enabled. I've had to pull my own SBLive from the machine due to this little glitch.

    dynamicx, you have that reversed. The gap between ATi and nVidia under OpenGL is very slight... 10% at 'actual' resolutions (meaning no 640x480x16, as no one paying $700 for a video card is going to go that low-res).
    SLI is the SAME TECHNOLOGY USED BY 3Dfx. They put out a press release saying that it's taken this long to integrate the technology. And as noted before, the performance with two cards can range anywhere from 120% to 180% of a single card.
    As for pixel shader 3.0... look at the specification, then look at the PS2.0 spec, and the hardware T&L baseline. ANY card that can do PS2.0 and has a hardware T&L unit can do PS3.0 with minor driver modifications (addition of the new name-types).
    The 6800 is falling behind. The UE is the only model that can put up somewhat of a fight, but still gets slapped around.... while sounding like a vacuum cleaner, taking up an insane amount of power, and giving crappy image quality.
    Oh, and the 'image quality wars' are not dead. When benchmarkers grab a screenshot and blow it up, that is to show the more subtle aspects.. which are HIGHLY visible when the game is in motion.

    And I'm sorry... I normally don't do this (except to Tom's Hardware). You expect anyone to believe FiringSquad? With their biased benchmarking procedures, in everything from motherboards to CPUs to video cards? Are they *STILL* insisting that a P4 @ 3.2GHz can outperform an Ath64 @ 2GHz? It's laughable. And don't even get me started on the number of CS debates that go on in their forums.
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    Instead of bickering like a bitter fanboy about how the x800 XT GT YF PE PC or the 6800GT VX FX OEM CS is better because they make your hair grow faster, I'll recommend this article: http://www.anandtech.com/video/showdoc.aspx?i=2113&p=1

    What it comes down to, in my opinion, leaving aside card performance and driver performance, is the price. And just as importantly, the games you want to play - something this article illustrates in a great way. I, for instance, have no reason to go for the X800 Pro, as the 6800 GTs are cheaper, especially when the performance differences are so insignificant (and I see myself using the card for a long time, just like this GF3, so the AA benchmarks are, at least to me, quite irrelevant :)). I furthermore refuse to go higher up, because it's a waste of money, at least for a Scandinavian customer.

    I don't doubt that the Nvidia cards will perform better with Doom 3, and neither do I doubt that Valve have been messing around with optimizations for the ATI cards, so they will perform better than their respective competitors.

    Naturally, it's not a situation that's at all fair for the customer who'd want to enjoy both games fully, but that's how it goes.
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556 Members
    You guys are getting way too worked up about this. It's a video card.
  • TommyVercetti Join Date: 2003-02-10 Member: 13390 Members, Constellation, Reinforced - Shadow
    edited July 2004
    I feel like I'm caught in the crossfire...

    What exactly does PS3.0 do? Right now I think I'm going with the X800, because I like the brute force it offers, and Talesin says the drivers could be modified at a later date to support PS3.0 (but would they be official drivers?). I don't have the cash for two nVidias right now, and I'm not too keen on the whole two-slot thing...

    [edit] Thanks for the heads up on the Creative card. Are they going to release a fix later? I don't really know any other decent cards, and this is the first time I've heard of this issue anywhere. [/edit]
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556 Members
    The Turtle Beach Santa Cruz is a great sound card as well. That's what I use over Creative Labs, just don't expect great Linux support.
  • Jim_has_Skillz Join Date: 2003-01-19 Member: 12475 Members, Constellation
    Haha, as I was saying, if you want the best of the best, you have to go with SLI technology and get two GeForce cards, because Nvidia is the only one with this technology. Yes, it will be expensive, it will be VERY hot inside, and it will steal multiple power connectors, but if you are a power enthusiast, this is the only way to go.

    For me, I am probably going to get the GeForce 6800 GT, considering that in the tests I have seen it outperforms its ATi counterpart (albeit by a small margin). I am not going to buy my card based on how hard it can kick another card's **** in one test (especially anti-aliasing; I couldn't care less about it).
  • TommyVercetti Join Date: 2003-02-10 Member: 13390 Members, Constellation, Reinforced - Shadow
    I'm still favoring the X800. It beats the nVidia in the two tests I care most about, Far Cry and Halo. I couldn't care less about racing or RTS games (though it still runs well in there), and as I said earlier I don't like the fact that the 6800 is so bulky.
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    edited July 2004
    QUOTE (TommyVercetti @ Jul 19 2004, 03:45 PM): What exactly does PS3.0 do?
    Currently? Not too much. Right now its main use is optimizations (of PS2.0 effects), pretty much. In the future, we're much likelier to see effects that are much more movie-like (in comparison to PS2.0), simply because it offers much greater freedom and fewer limits.

    However, unlike 3Dc, this is not a gimmick, and it'll be standard on all future graphics cards (until PS4.0 arrives :)).

    It's one of those things that might or might not be a benefit in the next 6 months, really, depending on the number of companies that'll do PS3.0 optimizations (as opposed to PS3.0 effects, which we won't see for a while). Only time will tell.
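    As a toy illustration of how "fewer limits" turns into optimizations: PS3.0 adds dynamic branching, so a shader can skip work per pixel instead of evaluating everything every time. This is a hypothetical Python mock-up of the control-flow idea, not real shader code; the light data and cutoff are invented:

    ```python
    # Hypothetical mock-up of PS3.0-style dynamic branching vs. a PS2.0-style fixed loop.
    # Pure illustration in Python; real pixel shaders are not written this way.
    LIGHTS = [(1.0, 2.0), (0.5, 50.0), (0.8, 3.5), (0.3, 80.0)]  # (intensity, distance), invented
    MAX_RANGE = 10.0  # beyond this, a light's contribution is treated as negligible

    def shade_fixed(lights):
        """PS2.0 style: no per-pixel branching, so every light is always evaluated."""
        return sum(intensity / (1.0 + dist) for intensity, dist in lights)

    def shade_branching(lights):
        """PS3.0 style: branch per pixel and skip lights that are out of range."""
        total = 0.0
        for intensity, dist in lights:
            if dist > MAX_RANGE:  # dynamic branch: skip the (nearly zero) contribution
                continue
            total += intensity / (1.0 + dist)
        return total

    print(f"fixed: {shade_fixed(LIGHTS):.3f}, branching: {shade_branching(LIGHTS):.3f}")
    ```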

    EDIT: If you care about Far Cry, you should at least wait and see how the 1.2 patch turns out, as it'll offer 3.0 support by then. :)
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited July 2004
    QUOTE (aaarrrgh @ Jul 19 2004, 03:10 PM): EDIT: If you care about Far Cry, you should at least wait and see how the 1.2 patch turns out, as it'll offer 3.0 support by then. :)
    FC 1.2 is out.

    [edit]I stand corrected - it'll be out soon.
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    edited July 2004
    QUOTE (Dragon_Mech @ Jul 19 2004, 04:15 PM), replying to the EDIT quoted above: FC 1.2 is out.
    FC 1.1 is. Far Cry 1.2 is, apparently, close to release.

    http://ubisoft-en.custhelp.com/cgi-bin/ubisoft_en.cfg/php/enduser/std_alp.php

    EDIT: Here's a nice article about PS3.0. While not all of the devs agree that PS3.0 is useful just yet, it's also obvious that it's not just another useless hack.

    Clickie: http://www.gamers-depot.com/interviews/dx9b/001.htm
  • TommyVercetti Join Date: 2003-02-10 Member: 13390 Members, Constellation, Reinforced - Shadow
    Each site I go to is either "nVidia rules" or "ATi is God."

    I guess honest reporting got dumped a while back, eh?

    Really all I want is a card that will run HL2, Doom III, and STALKER well enough for me to play with settings maxed at 1280x1024 res with a framerate above sixty. And it has to stay good for at least a year and a half to two years.
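    For a rough sense of scale, that target pins down a minimum pixel throughput. A back-of-envelope sketch (resolution and frame rate are from the post; overdraw, AA samples, and shader cost are ignored):

    ```python
    # Minimum fill rate implied by 1280x1024 at 60 fps, ignoring overdraw and AA.
    width, height, fps = 1280, 1024, 60
    pixels_per_second = width * height * fps
    print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")  # ~78.6 Mpixels/s, before overdraw
    ```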
  • TychoCelchuuu Anememone Join Date: 2002-03-23 Member: 345 Members
    QUOTE (TommyVercetti @ Jul 19 2004, 06:32 PM): Each site I go to is either "nVidia rules" or "ATi is God." I guess honest reporting got dumped a while back, eh? Really all I want is a card that will run HL2, Doom III, and STALKER well enough for me to play with settings maxed at 1280x1024 res with a framerate above sixty. And it has to stay good for at least a year and a half to two years.
    Quite possibly they both r0x0rz j00r b0x0rz. I'd go with ATI personally; it sucks less power.