NVIDIA Reveals GeForce 6
DOOManiac
Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462Members, NS1 Playtester
<div class="IPBDescription">HardOCP and Anandtech have good previews</div><a href='http://www.anandtech.com/video/showdoc.html?i=2023' target='_blank'>Anandtech Preview</a>
<a href='http://www.hardocp.com/article.html?art=NjA2' target='_blank'>Hard OCP Preview</a>
This card seems to be a double-edged sword. I predict that people are either going to completely love it or completely hate it.
[Image: http://images.anandtech.com/reviews/video/NVIDIA/GeForce6800U/angle1.jpg]
When you see this thing, the first question that comes to mind is: does it have the vacuum-cleaner properties of its predecessor? From the previews I glanced over, it is actually supposed to be very quiet.
The downside of this card? It's expensive: $600. It is also a power hog. NVIDIA recommends at LEAST a 480W power supply, and this puppy needs **TWO** 4-pin power connectors.
The good? It appears to genuinely kick the living crap out of the 9800 XT. In Anandtech's Halo tests, for instance, this thing was nearly twice as fast as the 9800 XT.
But don't forget, ATI's new card will no doubt be coming out very soon to go head-to-head with this, so as it always is in the hardware game, something faster is already in development. And probably cheaper.
Me? I'm going to wait and see. I've really lost my faith in NVIDIA, and the cost of the card, plus the cost of a new power supply (not to mention an increased electricity bill), makes me wonder whether the performance benefits are worth it...
[edit] BTW, when you read that HardOCP review and see the 9800 posting numbers close to the GF6's, keep in mind that for some stupid reason HardOCP benchmarked the games at DIFFERENT RESOLUTIONS. For example, in Battlefield Vietnam the GF6 (52.3 avg FPS) is barely ahead of the 9800 XT (49.4 avg FPS), but the GF6 was running at 1280x960 while the Radeon was only at 1024x768. The GF6 was also running 4xAA while the 9800 had no AA. I'd recommend sticking to Anandtech for a numbers comparison.
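If you want a crude way to see how lopsided that comparison really is, normalize the FPS numbers by pixels pushed per second. This is only a sketch: it assumes the game is purely fill-rate limited (it isn't) and ignores the GF6's 4xAA handicap entirely, so treat it as illustration, not a benchmark.

```python
# Crude normalization of HardOCP's Battlefield Vietnam numbers by pixel
# throughput. Assumes a purely fill-rate-limited workload (it isn't) and
# ignores that the GF6 was also running 4xAA -- illustration only.
gf6_fps, gf6_res = 52.3, (1280, 960)
r9800_fps, r9800_res = 49.4, (1024, 768)

gf6_mpix = gf6_fps * gf6_res[0] * gf6_res[1] / 1e6          # ~64.3 Mpixels/s
r9800_mpix = r9800_fps * r9800_res[0] * r9800_res[1] / 1e6  # ~38.8 Mpixels/s

print(f"GF6: {gf6_mpix:.1f} Mpix/s vs 9800 XT: {r9800_mpix:.1f} Mpix/s "
      f"(~{gf6_mpix / r9800_mpix:.2f}x)")
```

Even by this rough math the GF6 is pushing roughly 65% more pixels per second, and that's before giving it any credit for the antialiasing.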
Comments
pretty sexy card
it will be interesting to see how it fares
[edit]
hi CForrester, and ken! wewt and hello to JHunz as well ^_^
[/edit]
I'll take the CPU, then save up $250 for the 9800 XT, which will drop to that price after ATI releases its new card. Ohh ya, I'm a shexy shoppah. *wiggle* *wiggle*
P.S. No motherboard has 2 AGP slots. :P
According to the Tom's Hardware tests, anyway, the thing seems to pull slightly less power when idle and slightly MORE under full use...
I'm not sure how rock-solid that PSU requirement is.
QUOTE: The reviews said it is actually quite light and quiet, but yes, it does take that 2nd slot. And two molex connectors... uhhh... since when was that tough wiring? :)
Say what? A 480W PSU? No deal, mate. I only upgrade my PSU when it breaks down or when all my peripherals combined strain it too much; I wouldn't do it for one silly vid card. Add the requirement of two power cable connectors and I'm outta here.
Anyways, the only time I haven't felt cheated five months after buying a new graphics card was with the Voodoo1. That guy lasted well over two years. When I bought a GF2, it started feeling inadequate about a week later, and when I upgraded to a GF4 Ti 4400, I could've sworn it didn't make that much of a difference.
QUOTE (Umbraed Monkey): seriously, with all that power and cooling, it better beat the R9800XT by 2x on everything... and bake me a pai while doing so.
Judging from the heat my GF4 generates, I'd say you just need to put the ingredients in a bowl on top of that thing and it will indeed bake you a pai.
You could beat someone to death with that video card. I wonder what it weighs.
If I could spare the money for a vid card like this one, I would wait until ATi releases their new 'flagship'. I just like their stuff better.
Ho hum
Aside from the whining, I have a feeling that the GeForce 6 and Aquanox 3 will be released simultaneously.
QUOTE: The reviews said it is actually quite light and quiet, but yes, it does take that 2nd slot. And two molex connectors... uhhh... since when was that tough wiring?
They must be on separate, dedicated cables, not just daisy-chained off the same line. For people whose PSU meets the wattage requirement but has only three Molex cables, all already feeding hard drives, CD-ROM drives, and other junk, this is a difficult problem.
PCI Express can upload stuff to main RAM much, much faster than AGP. If graphics cards start being used for processing non-branching code (and with their increased programmability, they very well might be in a few years), this will be very important, but it isn't very relevant now.
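For the curious, here's the back-of-the-envelope math on why that readback path matters. These are peak theoretical figures as I remember the specs (AGP 8x downstream vs. its notoriously slow read path, and first-generation PCI Express x16), so double-check them before quoting me:

```python
# Rough peak bus bandwidth comparison -- spec-sheet figures from memory,
# not measurements, so treat the exact values as assumptions.
agp8x_downstream_gbs = 2.1  # CPU -> GPU over AGP 8x
agp8x_readback_gbs = 0.266  # GPU -> CPU: AGP reads crawl at roughly 1x speed
pcie_x16_gbs = 4.0          # each direction: 250 MB/s per lane * 16 lanes

print(f"GPU -> CPU readback: ~{pcie_x16_gbs / agp8x_readback_gbs:.0f}x "
      f"faster on PCI Express")  # ~15x
```

That asymmetry is why AGP was fine for games (textures mostly flow one way, toward the card) but a dead end for using the GPU as a general-purpose number cruncher.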
QUOTE: Aside from the whining, I have a feeling that the GeForce 6 and Aquanox 3 will be released simultaneously.
Welcome to the world of buying PC Hardware.
What, we have to buy a new monitor or conversion plug for this? :/
Amen to that.
QUOTE: What, we have to buy a new monitor or conversion plug for this? :/
Those are DVI plugs. Every card since like the GeForce 1 has had one of them on the back, and now NVIDIA is finally helping to push monitor technology out of the stone age.
It's for a much better video signal on flat-panel monitors.
Don't worry, I'm sure the card will come with a dongle that converts it to VGA. ATI has DVI-only cards too (we use them at work) and they all come with a dongle to translate to VGA for the few CRTs we have.