Quick Question: Which New GPU Will Last Me Longer?


Comments

  • Caboose title = name(self, handle) Join Date: 2003-02-15 Member: 13597 | Members, Constellation
    edited October 2004
    The 6800 for $270 also has only 128MB...

    You can get an X700 for $195 with 256MB of memory, a faster clock speed, and a faster effective memory speed...

    :edit: Even the X600 XT is better ($154). The only catch is that both ATI cards are PCI Express...

    6800
    Chipset/Core Speed: nVIDIA GeForce 6800 / 325MHz
    Memory/Effective Speed: 128MB DDR / 700MHz
    Bus: AGP 4X/8X
    Ports: VGA Out (15-pin D-Sub) + TV-Out (S-Video) + DVI
    Supported 3D APIs: DirectX 9, OpenGL 1.5
    Max Resolution @ 32-bit Color: 2048x1536 @ 85Hz

    X600 XT
    Chipset/Core Speed: ATI Radeon X600 XT / 500MHz
    Memory/Effective Speed: 128MB DDR / 740MHz
    Bus: PCI Express x16
    Ports: VGA Out (15-pin D-Sub) + VIVO + DVI
    Supported 3D APIs: DirectX 9, OpenGL 1.5
    Max Resolution @ 32-bit Color: 2048x1536

    X700 Pro
    Chipset/Core Speed: ATI Radeon X700 Pro / 420MHz
    Memory/Effective Speed: 256MB DDR / 860MHz
    Bus: PCI Express
    Ports: VGA Out (15-pin D-Sub) + TV-Out (S-Video/Composite) + DVI
    Supported 3D APIs: DirectX 9, OpenGL 2.0
    Max Resolution @ 32-bit Color: 2048x1536 @ 85Hz
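
    To put the value comparison in raw numbers, here's a rough Python sketch using the prices quoted above (memory alone obviously isn't the whole story, but the gap is clear):

    # Price per MB of video memory, using the prices and specs listed above.
    cards = {
        "6800 (128MB, AGP)":       (270, 128),
        "X600 XT (128MB, PCI-E)":  (154, 128),
        "X700 Pro (256MB, PCI-E)": (195, 256),
    }

    for name, (price_usd, mem_mb) in cards.items():
        print(f"{name}: ${price_usd / mem_mb:.2f} per MB")

    # 6800 (128MB, AGP):       $2.11 per MB
    # X600 XT (128MB, PCI-E):  $1.20 per MB
    # X700 Pro (256MB, PCI-E): $0.76 per MB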
  • Travis_Dane Join Date: 2003-04-06 Member: 15249 | Members
    Well, if we're going to be budget-happy, get a 6800LE and unlock the remaining pixel and vertex pipelines. Tada: a 6800 for $200, which you can also overclock on core and memory to get close to a GT.
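
    To see why that gets you near a GT, compare theoretical pixel fill rate (pipelines x core clock). A rough Python sketch, using commonly cited pipeline counts and clocks for the 6800 line rather than numbers from this thread:

    # Theoretical pixel fill rate = pixel pipelines x core clock (MHz).
    # Pipeline counts and clocks are commonly cited figures for the
    # GeForce 6800 line, assumed here for illustration only.
    cards = [
        ("6800 LE (stock)",         8, 300),
        ("6800 LE (unlocked + OC)", 16, 350),  # all 16 pipes enabled, core pushed near GT speed
        ("6800 (plain)",            12, 325),
        ("6800 GT",                 16, 350),
    ]

    for name, pipes, core_mhz in cards:
        # Fill rate in Mpixels/s: one pixel per pipe per clock.
        print(f"{name}: {pipes * core_mhz} Mpixel/s")

    # 6800 LE (stock):         2400 Mpixel/s
    # 6800 LE (unlocked + OC): 5600 Mpixel/s  <- right on top of the GT
    # 6800 (plain):            3900 Mpixel/s
    # 6800 GT:                 5600 Mpixel/s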
  • Crono5 Join Date: 2003-07-22 Member: 18357 | Members
    QUOTE (Talesin @ Oct 16 2004, 06:16 PM):
    "But then, I guess if you're used to a card that can't even do ACTUAL Pixel Shader 2.0, that kind of thing would excite you. :)

    ERK! Chrono, look at the X700 series then. They stomp the crap out of the standard 6800s in gaming benchmarks, and run between $200 and $250 ($250 for the top-end X700 with 256MB RAM and a higher-clocked core, which gives the 6800GT a run for its money). The only problem is that they're still making their way out to retail stores at the moment."
    First off, yes, talk of pixel shaders gets me excited. Very, very excited. See, until my current Radeon 9200, I'd been living off integrated video chipsets. I ran the Half-Life 2 Stress Test and pretty much went :O . So yeah, I thought Pixel Shader 3.0 would be, like, super-duper-ultra-spectacular cool. My dad's been the one with the luxury of the TNT2 and GeForce Ti (which I thought was some pretty hot stuff). I used to follow this graphics card stuff back when the Ti4600 was unbeatable and Matrox had just come out with their triple-monitor card; I think it was called the Parhelia or something.

    Also, in case you couldn't tell, I'm easily swayed; I just want the damn system to be finalized ><

    I didn't see the X700 (which I would've picked out if I had...) because I decided on a motherboard with AGP, thinking, "Bah, PCI-E probably came out yesterday; no one has anything for it."

    The price difference is enough to let me switch motherboards.

    So X700!

    Undoubtedly my final choice.

    Well, at least hopefully. I sort of like ATi more than nVidia because they make the GameCube graphics chips. And because I have a Radeon 9200 in my box and it's the best thing I've ever touched.

    Which, I realize, is pathetic.
  • funbags Join Date: 2003-06-08 Member: 17099 | Members
    With Cat 4.10, an X700 shouldn't be that far behind the X800... it's only 100 difference :D
  • Travis_Dane Join Date: 2003-04-06 Member: 15249 | Members
    Yes, well, since everyone seems to pimp the Catalyst 4.10 drivers: ATi screwed them up (http://forums.guru3d.com/showthread.php?threadid=110159&perpage=10&pagenumber=4).
  • BaconTheory Join Date: 2003-09-06 Member: 20615 | Members
    I don't know exactly when, but I've heard that the next generation of video cards will have 512MB. I don't know how true that is, but oh well.

    Oh yeah, make a big investment and go for the X800 XT PE.
  • funbags Join Date: 2003-06-08 Member: 17099 | Members
    QUOTE (Travis Dane @ Oct 17 2004, 09:15 AM):
    "Yes, well, since everyone seems to pimp the Catalyst 4.10 drivers: ATi screwed them up."

    That's a forum full of people like you and me, who screw the drivers up themselves. Testimonials are not a reliable source of information. I'd like to see some proof from a professional review website or from ATi that the drivers are screwed up. Half the people in that thread said they run perfectly and that they're getting better performance.

    "did all the trouble users use dna or omega drivers"

    Please go find real proof.
  • Travis_Dane Join Date: 2003-04-06 Member: 15249 | Members
    QUOTE (funbags @ Oct 17 2004, 06:06 PM):
    "That's a forum full of people like you and me, who screw the drivers up themselves. Testimonials are not a reliable source of information. I'd like to see some proof from a professional review website or from ATi that the drivers are screwed up."

    Haha, like ATi's ever going to admit their drivers are screwed up. If you want a proper indication of a driver's functionality, you obviously have to try it on as many different configurations as possible. Sure, a review site can knock up a couple of systems, but that doesn't come near the consumer masses. Not to mention these drivers are SUPPOSED to be used by consumers, and if consumers can't get them running properly, then YES, something is not right.

    Working drivers out of the box is something we should be able to expect in 2004...
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 | NS1 Playtester, Forum Moderators
    Travis. You have to realize... many people can't make TOAST without screwing it up. Even more walk out into traffic without looking.

    The short and sweet of it is, a lot of people are UTTER RAVING MORONS. They try to make toast and end up with their genitalia stuck in there, and the dog's run off with the bread.

    Personally, I've had zero problems with the drivers. Installed out-of-the-box with no hassles, as I read the instructions.
    However, I'm certain that if you handed them to someone who routinely stuffs the mouse up their... posterior to use it, they'd manage to screw things up. Either by not downloading the non-.NET version (sans CCC), or by downloading the CCC version and not bothering to install .NET 1.1 first.


    Yes. This is a PEBKAC error. Their code is ID-10T. They need a hard LART to the face, if they can't even manage to get simple drivers working. Hell, hand them cyanide peeps, clearly labelled, and see if they read the 'DO NOT EAT' instructions. If they don't, the world will quickly be a better place for everyone.
  • TheFrostmourne Join Date: 2004-09-14 Member: 31708 | Members
    ATi definitely did not screw up the drivers.
    I got a nice performance increase in all my games (UT2k4, BF:V, BF1942).
    CS, DoD and NS already ran at 90-100 FPS in-game before these drivers, so nothing new in that department.

    Thumbs up to the new ATi driver from me; I also like the new Control Center interface.
  • Silicon Join Date: 2003-02-18 Member: 13683 | Members
    edited October 2004
    Is it just me, or does ATI have to clock their GPU/VPU and RAM faster just to compete (within a small margin) with Nvidia? It seems like the situation with Intel, which just clocks its processors higher, while AMD still competes decently against them at lower clocks. But hey, that's just me.
  • DrSuredeath Join Date: 2002-11-11 Member: 8217 | Members
    QUOTE (Silicon @ Oct 17 2004, 02:54 PM):
    "Is it just me, or does ATI have to clock their GPU/VPU and RAM faster just to compete (within a small margin) with Nvidia? It seems like the situation with Intel, which just clocks its processors higher, while AMD still competes decently against them at lower clocks."

    Correction.

    ATI only has to clock their GPU faster to compete with the new Nvidia line.
    If anything, that's more ownage.
  • Silicon Join Date: 2003-02-18 Member: 13683 | Members
    edited October 2004
    QUOTE (Dr.Suredeath @ Oct 17 2004, 01:18 PM):
    "Correction.

    ATI only has to clock their GPU faster to compete with the new Nvidia line.
    If anything, that's more ownage."

    I guess you don't get what I'm trying to say, so I'll put it even more simply: there's a ratio of MHz to performance, and Nvidia seems to have a higher performance-per-MHz ratio than ATI does.

    Radeon X800 XT PE: 520MHz core / 550MHz mem 256bit GDDR-3
    GF 6800 Ultra: 400MHz core / 550MHz mem 256bit GDDR-3

    3DMark 2003:
    Radeon X800 XT PE: 12204
    GF 6800 Ultra Extreme: 11873
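
    To spell that ratio out with the numbers above (a rough Python sketch, nothing more; the 400MHz card's name is as I pasted it, see the correction below):

    # 3DMark points per core MHz, from the scores and clocks quoted above.
    cards = {
        "Radeon X800 XT PE":     (12204, 520),
        "GF 6800 Ultra Extreme": (11873, 400),
    }

    for name, (score, core_mhz) in cards.items():
        print(f"{name}: {score / core_mhz:.1f} points per core MHz")

    # Radeon X800 XT PE:     23.5 points per core MHz
    # GF 6800 Ultra Extreme: 29.7 points per core MHz
    # -> the Nvidia part scores more per clock, which is the ratio I mean.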
  • Travis_Dane Join Date: 2003-04-06 Member: 15249 | Members
    QUOTE (Silicon @ Oct 17 2004, 11:01 PM):
    "There's a ratio of MHz to performance, and Nvidia seems to have a higher performance-per-MHz ratio than ATI does.

    Radeon X800 XT PE: 520MHz core / 550MHz mem 256bit GDDR-3
    GF 6800 Ultra Extreme: 400MHz core / 550MHz mem 256bit GDDR-3

    3DMark 2003:
    Radeon X800 XT PE: 12204
    GF 6800 Ultra Extreme: 11873"
    The core of an Ultra Extreme is clocked at 450MHz; what you have there is an Ultra.
  • Silicon Join Date: 2003-02-18 Member: 13683 | Members
    Ya, I pasted the wrong name from the chart: http://users.erols.com/chare/video.htm
  • TheFrostmourne Join Date: 2004-09-14 Member: 31708 | Members
    edited October 2004
    The last thing you want to do is compare the clock speeds of Intel CPUs to AMD CPUs.

    The example you gave is utterly stupid. When you look at processors, the front-side bus speed (or, in AMD's case, the HyperTransport speed), the on-die cache size, and the number of cache levels play a large role in the performance of the CPU.

    And that shouldn't be carried over in parallel to GPUs.
  • V_MAN V-MAN Join Date: 2002-11-03 Member: 6217 | Members, Constellation
    ATI's X800 for the win, tbh.

    NVIDIA is a dying breed.
  • Silicon Join Date: 2003-02-18 Member: 13683 | Members
    QUOTE (The-Frostmourne @ Oct 17 2004, 03:32 PM):
    "The last thing you want to do is compare the clock speeds of Intel CPUs to AMD CPUs.

    The example you gave is utterly stupid. When you look at processors, the front-side bus speed (or, in AMD's case, the HyperTransport speed), the on-die cache size, and the number of cache levels play a large role in the performance of the CPU.

    And that shouldn't be carried over in parallel to GPUs."

    You obviously didn't read my example correctly; I'm sorry that you don't understand it.
  • Darten Join Date: 2003-09-03 Member: 20513 | Members
    QUOTE (DOOManiac @ Oct 16 2004, 12:02 AM):
    "Both cards are so close it's really just personal preference and what you can get your hands on, plus what your budget is (though the cards are pretty much identically priced; maybe one will be $20 cheaper or something, but it will be close).

    Disregard everything everyone tells you and just get the one that is your personal preference.

    Yes, that includes this post."
    That's the best advice you'll hear.

    And Talesin, your opinions are so biased I bet even die-hard ATi fanboys read them and say, "Holy ****, that guy is biased."
  • funbags Join Date: 2003-06-08 Member: 17099 | Members
    Also, the 6800 eats a lot of power.

    ATi ftw.
  • illuminex Join Date: 2004-03-13 Member: 27317 | Members, Constellation
    edited October 2004
    Actually, Talesin makes a lot of excellent points. Nvidia was top dog on the GPU front for several years, completely uncontested. Then ATi made the big jump with the 9700, which completely blew away most of Nvidia's offerings, even a year later. They started releasing decent drivers, and everything started going up. Meanwhile, Nvidia released the FX series of cards, the first group noted mostly for their amazing noise capabilities, the second known for still lacking performance and for not being true DX9 cards.

    See, ATi sucker-punched the king and took the throne. Now they're actually wrestling over the top spot, and ATi's superior performance and graphical capabilities forced Nvidia to stop dragging their feet and screwing the consumer with poor image quality just to squeeze out 2 more fps. So the consumer benefits, since both have to release top-quality GPUs.

    Now, ATi has done some dumb crap recently: downplaying the importance of SLI, griping about native PCI-E, etc. However, their cards are still the best for Source-based games, and they're finally improving their OpenGL drivers. You can bet that the next generation of cards will probably blow this generation away in as many ways as possible.

    Also, does anyone else think that, with the advent of SLI, ATi may choose to simply create a "super card" that will carry similar performance? That's what happened last time someone made it possible to use two video cards as one; the next generation of cards had doubled performance, making the double card solution obsolete.
  • Crono5 Join Date: 2003-07-22 Member: 18357 | Members
    QUOTE (illuminex @ Oct 20 2004, 04:31 PM):
    "Also, does anyone else think that, with the advent of SLI, ATi may choose to simply create a 'super card' that will carry similar performance? That's what happened last time someone made it possible to use two video cards as one; the next generation of cards had doubled performance, making the double-card solution obsolete."
    Why not network two of the double-power cards together?
  • funbags Join Date: 2003-06-08 Member: 17099 | Members
    Because the world would explode in a flurry of FPS.
  • DragonMech Join Date: 2003-09-19 Member: 21023 | Members, Constellation, Reinforced - Shadow
    You say that like it was a bad thing...