Nvidia Reveals GeForce 6

DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
edited April 2004 in Off-Topic
HardOCP and Anandtech have good previews
Anandtech Preview: http://www.anandtech.com/video/showdoc.html?i=2023
HardOCP Preview: http://www.hardocp.com/article.html?art=NjA2

This card seems to be a double edged sword for the most part. I predict that people are either going to completely love it or completely hate it.

[Image: GeForce 6800 Ultra, angled shot: http://images.anandtech.com/reviews/video/NVIDIA/GeForce6800U/angle1.jpg]

When you see this thing, the first question that comes to mind is whether it has the vacuum-cleaner acoustics of its predecessor. From the previews I glanced over, it is actually supposed to be very quiet.

The downside of this card? It's expensive at $600, and it's a power hog: NVidia recommends at LEAST a 480W power supply, and this puppy needs **TWO** 4-pin Molex connectors.

The good? It appears to genuinely kick the living crap out of the 9800 XT. In Anandtech's Halo tests, for instance, this thing was nearly twice as fast as the 9800 XT.

But don't forget, ATI's new card will no doubt be coming out very soon to go toe-to-toe with this, so as it always is in the hardware game, something else is already in development that will be faster. And probably cheaper.


Me? I'm going to wait and see. I've really lost my faith in NVidia, and the cost of the card, plus the cost of a new power supply (not to mention the increased electricity bill), makes me wonder whether the performance benefits are worth it...

[edit]BTW, when you read that HardOCP review and see the 9800 with numbers closer to the GF6, keep in mind that for some stupid reason HardOCP benchmarked the games at DIFFERENT RESOLUTIONS. For example, in Battlefield: Vietnam the GF6 (52.3 avg FPS) is barely higher than the 9800 XT (49.4 avg FPS), but the GF6 was running at 1280x960 whereas the Radeon was only at 1024x768. The GF6 was also running 4xAA while the 9800 had no AA. I'd recommend just sticking to Anandtech for a numbers comparison.
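
For a rough apples-to-apples sense of those numbers, here is a back-of-the-envelope sketch of raw pixel throughput at the resolutions HardOCP actually used (it ignores the 4xAA the GF6 was also running, so if anything it understates the gap):

# Rough pixel-throughput comparison of HardOCP's Battlefield: Vietnam numbers.
# FPS and resolutions are taken from the review; the rest is plain arithmetic.
gf6_pixels_per_sec = 1280 * 960 * 52.3    # GeForce 6800 Ultra at 1280x960, 4xAA
r9800_pixels_per_sec = 1024 * 768 * 49.4  # Radeon 9800 XT at 1024x768, no AA

print(f"GeForce 6800 Ultra: {gf6_pixels_per_sec / 1e6:.1f} Mpixels/s")
print(f"Radeon 9800 XT:     {r9800_pixels_per_sec / 1e6:.1f} Mpixels/s")
print(f"Ratio: {gf6_pixels_per_sec / r9800_pixels_per_sec:.2f}x")  # roughly 1.65x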

Comments

  • MrPink Join Date: 2002-05-28 Member: 678 Members
    When is this thing supposed to come out? I was gonna get a 9800XT, but I might just get a 9800 Pro and plan on buying this when it's released.
  • Starchy Join Date: 2003-04-21 Member: 15727 Members, Constellation
    $600? No thanks, I'll stick with meh £60 one. :D
  • That_Annoying_Kid Sire of Titles Join Date: 2003-03-01 Member: 14175 Members, Constellation
    edited April 2004
    O_O

    pretty sexy card

    it will be interesting to see how it fares

    [edit]
    hi CForrester, and ken! wewt and hello to JHunz as well ^_^
    [/edit]
  • JHunz Join Date: 2002-11-15 Member: 8815 Members, Constellation
    Sounds sexy. Too bad I'm too poor for it. I wonder how long after this announcement it will be before ATI unveils their newest card.
  • Domo-Kun Join Date: 2004-03-18 Member: 27410 Members
    Funny... hmmm, 600 bucks for a vid card? Or 600 bucks for an Athlon 64 FX CPU?
    I'll take the CPU, then save up $250 for the 9800XT, which will drop to that price after ATI releases its new card. Ohh ya, I'm a shexy shoppah. *wiggle* *wiggle*
  • Jimmeh Join Date: 2003-08-24 Member: 20173 Members, Constellation
    ATi forever ;)
  • Comprox *chortle* Canada Join Date: 2002-01-23 Member: 7 Members, Super Administrators, Forum Admins, NS1 Playtester, NS2 Developer, Constellation, NS2 Playtester, Reinforced - Shadow, WC 2013 - Silver, Subnautica Developer, Subnautica Playtester, Pistachionauts
    Does that thing cover up another PCI/AGP/whatever slot with its massive heatsink? If so, biiig downside for me :/
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    It covers up at least one PCI slot, maybe even 2. Yes, a definite downside.

    P.S. No motherboard has 2 AGP slots. :P
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited April 2004
    [EDIT] DOOM beat me to it.
  • Liku I, am the Somberlain. Join Date: 2003-01-10 Member: 12128 Members
    That thing better make me waffles.
  • MrPink Join Date: 2002-05-28 Member: 678 Members
    Yeah this card is absolutely out of the question for me **as it requires a 480W power supply and covers up 1-2 PCI slots**
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556 Members
    QUOTE (MrPink @ Apr 14 2004, 02:26 PM): Yeah this card is absolutely out of the question for me as it requires a 480W power supply and covers up 1-2 PCI slots
    According to the Tom's Hardware tests, anyway, the thing seems to pull slightly less power when idle and slightly MORE under full use...

    I'm not sure how rock-solid that PSU requirement is.
  • BigMadSteve Join Date: 2003-02-12 Member: 13472 Members
    Mighty card, but give ATi a while. That said, I would take it if it were given to me, though it looks like you'd need a crane to move it onto your PC and some engineers to re-route the electrical cables in the area so the card gets the power it needs.
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556 Members
    QUOTE (BigMadSteve @ Apr 14 2004, 02:49 PM): Mighty card, but give ATi a while. That said, I would take it if it were given to me, though it looks like you'd need a crane to move it onto your PC and some engineers to re-route the electrical cables in the area so the card gets the power it needs.
    The reviews said it is actually quite light and quiet, but yes, it does take that 2nd slot.

    And two molex connectors... uhhh... since when was that tough wiring? :)
  • Umbraed_Monkey Join Date: 2002-11-25 Member: 9922 Members
    seriously, with all that power and cooling, it better beat R9800XT by 2x on everything....and bake me a pai while doing so.
  • Scinet Join Date: 2003-01-19 Member: 12489 Members, Constellation
    QUOTE (DOOManiac @ Apr 14 2004, 01:04 PM): The downside of this card? It's expensive at $600, and it's a power hog: NVidia recommends at LEAST a 480W power supply, and this puppy needs **TWO** 4-pin Molex connectors.
    Say what? 480W PSU? No deal, mate. I only upgrade my PSU when it breaks down or when all my peripherals combined strain it too much. I wouldn't do it for one silly vid card. Add the requirement of 2 power cable connectors and I'm outta here.

    Anyways, the only time I haven't felt cheated five months after buying a new graphics card was with the Voodoo1. That guy lasted for well over two years. When I bought a GF2, it started feeling inadequate about a week later, and when I upgraded to a GF4 Ti4400, I could've sworn it didn't make much of a difference.

    QUOTE (Umbraed Monkey): seriously, with all that power and cooling, it better beat R9800XT by 2x on everything....and bake me a pai while doing so.
    Judging from the heat my GF4 generates, I'd say you just need to put the ingredients in a bowl on top of that thing and it will indeed bake you a pai.
  • Jimmeh Join Date: 2003-08-24 Member: 20173 Members, Constellation
    QUOTE: The 16x1 GeForce 6800 Ultra will be clocked at 400/550 (core/mem) and priced at $499, while its 12x1 little brother the GeForce 6800 non-ultra will be priced at $299 (clock speeds to be determined).
  • Grillkohle Join Date: 2003-12-23 Member: 24695 Members, Constellation
    edited April 2004
    O_O
    You could beat someone to death with that video card. I wonder what it weighs.
    If I could spare the money for a vid card like this one, I would wait until ATi releases their new 'flagship'. I just like their stuff better.
  • TenSix Join Date: 2002-11-09 Member: 7932 Members
    Bah, I plan on just holding off on video card upgrades until Half-Life 2 is out. Of course, by that time the GF 6800 very well might be scraping the bottom of the bargain barrel :(
  • Bogglesteinsky Join Date: 2002-12-24 Member: 11488 Members
    so, you blow $600 on an AGP vid card, then PCI Express takes over...

    Ho hum
  • Omegaman Join Date: 2004-01-11 Member: 25239 Members
    HOLY FRIGGIN CRAP! That's not cool! I've been saving up my money and I just bought my GeForce FX LAST WEEK! And they're already moving to the next big video card? Not fair! :(

    Aside from the whining, I have a feeling that the GeForce 6 and Aquanox 3 will be released simultaneously.
  • Warrior Join Date: 2003-02-16 Member: 13624 Members
    Drools. Looks nice, but I'll wait for the new ATI card and for NVidia's next vid cards using that new PCI interface.
  • MrPink Join Date: 2002-05-28 Member: 678 Members
    PCI Express is grossly overrated
  • Soylent_green Join Date: 2002-12-20 Member: 11220 Members, Reinforced - Shadow
    QUOTE: The reviews said it is actually quite light and quiet, but yes, it does take that 2nd slot. And two molex connectors... uhhh... since when was that tough wiring?

    They must be on separate, dedicated cables, not just all coming off the same line. For people with a compatible PSU but only 3 Molex cables and too many hard drives, CD-ROM drives, and other junk, this is a difficult problem.
  • Soylent_green Join Date: 2002-12-20 Member: 11220 Members, Reinforced - Shadow
    QUOTE: PCI Express is grossly overrated

    PCI Express can upload stuff to main RAM much, much faster than AGP. If graphics cards start being used to process non-branching code (and with their increased programmability they very well might be in a few years), this will be very important, but it isn't very relevant now.
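
    To put very rough numbers on that, here is a back-of-the-envelope sketch; the bandwidth figures below are my own assumptions (AGP's upstream path running at roughly PCI speed versus first-generation PCI Express x16), not measurements:

# Hypothetical GPU-to-system-RAM readback comparison; bandwidth figures are assumptions.
AGP_READBACK_BYTES_PER_SEC = 266e6   # AGP's upstream (card-to-RAM) path is roughly PCI speed
PCIE_X16_BYTES_PER_SEC = 4e9         # ~250 MB/s per lane x 16 lanes, per direction

frame_bytes = 1280 * 960 * 4         # one 32-bit 1280x960 framebuffer, about 4.9 MB

for name, bw in [("AGP readback", AGP_READBACK_BYTES_PER_SEC),
                 ("PCIe x16 readback", PCIE_X16_BYTES_PER_SEC)]:
    print(f"{name}: {frame_bytes / bw * 1000:.2f} ms per frame")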
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    QUOTE (Omegaman! @ Apr 14 2004, 03:31 PM): HOLY FRIGGIN CRAP! That's not cool! I've been saving up my money and I just bought my GeForce FX LAST WEEK! And they're already moving to the next big video card? Not fair! :(

    Aside from the whining, I have a feeling that the GeForce 6 and Aquanox 3 will be released simultaneously.
    Welcome to the world of buying PC Hardware.
  • Sniper_Chance Join Date: 2002-12-11 Member: 10549 Members
    What's with those female ports? They look nothing like standard VGA ports.

    What, we have to buy a new monitor or conversion plug for this? :/
  • Mullet Join Date: 2003-04-28 Member: 15910 Members, Constellation
    QUOTE (Jimmeh @ Apr 14 2004, 11:32 AM): ATi forever ;)
    Amen to that.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    QUOTE ($niper Chance @ Apr 14 2004, 05:09 PM): What's with those female ports? They look nothing like standard VGA ports.

    What, we have to buy a new monitor or conversion plug for this? :/
    Those are DVI plugs. Every card since like the GeForce 1 has had one of them on the back, and now NVidia is finally helping to push monitor technology out of the stone age.

    It's for a much better video signal on flat-panel monitors.

    Don't worry, I'm sure the card will come with a dongle that converts it to VGA. ATI has DVI-only cards too (we use them at work), and they all come w/ a dongle to convert to VGA for the few CRTs we have.
  • Epidemic Dark Force Gorge Join Date: 2003-06-29 Member: 17781 Members
    ATI is dead, long live the new champion! :P $300 is affordable for a video card :)
This discussion has been closed.