I can't recommend a 6600GT enough. I've got one and it's the mutt's nuts, the dog's danglies, the badger's tadgers... Ahem. Though make sure you go with a recognised brand; I got a crap one and it overheated all the time, so I swapped it for an XFX one, which is awesome.
They're very nice, but I can't see mine keeping at the cutting edge of graphics for more than half a year longer.

QUOTE (CoolCookieCooks @ May 11 2005, 01:36 PM): Ah, so you had the same problems as me, twoflow? Damn....

QUOTE (twoflow @ May 11 2005, 01:52 PM): Yeah, it sounds identical. I had the corrupted Windows loading screen as well.

QUOTE (CoolCookieCooks @ May 11 2005, 08:59 AM): Mkay. At the moment I'm following twoflow's story.

Kouji San, if I upgrade the BIOS on my gfx card, it'll definitely void the warranty. My PSU is a 420W Thermaltake one, so I doubt it's that.

Obst, I've had this card on 2 different mobos, both of which are Asus.

For a replacement gfx card, I've got my eye on the Nvidia 6600GT. I heard Asus were poor for graphics cards, so what make would be best for an Nvidia card?

Edit: raz0r, this card isn't overclocked, just the same as I got it, except for the GPU cooler.

QUOTE (spinviper @ May 12 2005, 06:00 AM): You might have cracked your card's PCB when installing it. I have a 9800 PRO and <3 it. Also, installing a 3rd-party cooler voids your warranty. GL!

Well, the company wasn't smart enough to notice last time that I had a third-party cooler installed on it previously... :p

Plus I can't see any physical damage to the card, such as cracks on the PCB. The only damage is a couple of mangled split pins from where I tried to get the original cooler off >_>

@ twoflow: I don't care if it doesn't meet cutting-edge graphics, I just want an Nvidia card that's equivalent to or better than the ATI 9800 Pro :D
Again, the 6600GT is the best bang-per-buck card you can buy at the moment. I have 2 of the XFX Gamers Edition cards in my system, racking up about 6400 3DMark05 marks and giving me 80 fps during Ravenholm in HL2. I win :)
Bleh, FireGLs are only any good for CAD work or 3D modelling.

Put simply... once again, NVIDIA wins the race: http://www.gainward.de/new/products/pcx/coolfx/comparison.html (Gainward's own comparison page).
QUOTE (TommyVercetti @ May 12 2005, 09:00 PM): Meh, X850XTs of the PCI-X variety on an SLI mobo are better than the nVidia offerings. But hey, I'm not revisiting this thread because I'm sure people will disagree.

QFGreatJustice
QUOTE (Lt Patch @ May 12 2005, 09:03 PM): Bleh, FireGLs are only any good for CAD work or 3D modelling. Put simply... once again, NVIDIA wins the race: http://www.gainward.de/new/products/pcx/coolfx/comparison.html

Except ATi won the race: http://www1.graphics.tomshardware.com/graphic/20041222/vga_charts-09.html

The X800XL outperforms two 6800 Ultras at 1024x768, for less than half the price. Surely that counts for something?
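Just to spell out the bang-per-buck reasoning behind that: divide frame rate by price and compare. A minimal Python sketch follows; the fps and price figures in it are hypothetical placeholders, not numbers taken from the Tom's Hardware charts, so plug in the real ones before drawing any conclusions.

CODE:
# Hypothetical figures purely for illustration -- substitute real benchmark
# fps and real street prices before comparing cards this way.
cards = {
    "X800XL":        {"fps": 75.0, "price": 300.0},   # single card (placeholder numbers)
    "2x 6800 Ultra": {"fps": 73.0, "price": 900.0},   # SLI pair (placeholder numbers)
}

for name, card in cards.items():
    value = card["fps"] / card["price"]  # frames per second per unit of currency
    print("%s: %.3f fps per currency unit" % (name, value))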
Pfft, buy an AMD since that's bett... Oh wait!

I'm getting a Sapphire X850 XT Platinum VIVO 256MB (PCI-E) this week, or at least I hope they deliver it this week... ONLY ONE MORE DAY LEFT!!! :(
QUOTE (raz0r @ May 12 2005, 03:10 PM): Except ATi won the race: http://www1.graphics.tomshardware.com/graphic/20041222/vga_charts-09.html The X800XL outperforms two 6800 Ultras at 1024x768, for less than half the price. Surely that counts for something?

Yeah, but if you've got enough cash for two 6800 Ultras, then you won't be running at 1024x768. nVidia won every single benchmark in your link at the highest quality settings. If you just look at 1024x768, then SLI doesn't give you an advantage.
Sounds like overheating; you might want to try more thermal paste since you changed fans. I have a Sapphire Radeon 9800 128MB and I never have any problems besides (recently) graphics corrupting in PlanetSide, but SOE can't code **** and that game's never run right on my FX 5600 or 9800 :/

Just keep your drivers up to date, check your BIOS and make sure the settings are good, and clean the slots on your mobo to make sure the pads aren't shorting out. Oxidation can be messy sometimes if you get any kind of moisture onto copper pads.

Also, if you've had previous cards from different manufacturers: I've noticed that when you switch from ATI to NVIDIA, the NVIDIA card won't work right, and if you go from NVIDIA to ATI, the ATI won't work right. The GeForce FX 5600 I had was so bad on my machine that I couldn't play without random actual reboots, not shutdowns, reboots. No errors, no noises; it would just either lock up with the screen going black or reboot. Not even the computer stores around town could find out what was wrong with it. I eventually got the 9800 and haven't had problems since. But before the FX I had a Radeon 7000 (this was a long time ago :p), so that might have been the problem all along.
To prevent dodginess as outlined by drfuzzy, use Driver Cleaner Pro (http://downloads.guru3d.com/download.php?det=745) every time you install a new driver, be it for a new card or just an update for your current one.
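If you want a rough sanity check that the old drivers really are gone before installing the new ones, something like the little Python sketch below will do. The file names and the system32 path are assumptions about a typical XP-era install, and it's no substitute for actually running Driver Cleaner Pro.

CODE:
# Rough sanity check, not a replacement for Driver Cleaner Pro.
# The file names below are typical XP-era display driver leftovers; adjust
# them for your own install (assumption: Windows lives in C:\Windows).
from pathlib import Path

SYSTEM32 = Path(r"C:\Windows\system32")

LEFTOVERS = {
    "NVIDIA": ["nv4_disp.dll", "nv4_mini.sys", "nvcpl.dll"],
    "ATI":    ["ati2dvag.dll", "ati2mtag.sys"],
}

def find_leftovers(vendor):
    """Return any old driver files from this vendor that survived the uninstall."""
    return [SYSTEM32 / name for name in LEFTOVERS[vendor] if (SYSTEM32 / name).exists()]

for vendor in LEFTOVERS:
    for leftover in find_leftovers(vendor):
        print("Leftover", vendor, "driver file:", leftover)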
QUOTE (Drfuzzy @ May 12 2005, 11:03 PM): Sounds like overheating; you might want to try more thermal paste since you changed fans. [...]

Don't mean to be mean or anything, fuzzy, but have you even read what's been said in the thread?
QUOTE (theclam @ May 12 2005, 11:38 PM): Yeah, but if you've got enough cash for two 6800 Ultras, then you won't be running at 1024x768. nVidia won every single benchmark in your link at the highest quality settings. If you just look at 1024x768, then SLI doesn't give you an advantage.

Except they were <b>all</b> part of "the way it's meant to be played".
QUOTE (raz0r @ May 13 2005, 04:48 PM): Except they were <b>all</b> part of "the way it's meant to be played".

3DMark05 them, then. The only systems that I've seen breach the 10k barrier are 6800 Ultra systems in SLI mode.

Which is something ATi don't have, even though they are developing something similar. All I can say is: dear God, don't let them do multiple cores on the same card...

Remember the one they did with the Rage chipsets? Ick... :(
QUOTE (Lt Patch @ May 13 2005, 07:03 PM): 3DMark05 them, then. The only systems that I've seen breach the 10k barrier are 6800 Ultra systems in SLI mode. [...]

But 3DMark is only a benchmark. Real performance is what matters.

But in all honesty, there is no <b>real</b> difference between ATi and Nvidia right now; it's mostly a matter of personal preference/fanboy influence.

<=== ATi Fanboy!
QUOTE (raz0r @ May 13 2005, 06:18 PM): But 3DMark is only a benchmark. Real performance is what matters. But in all honesty, there is no <b>real</b> difference between ATi and Nvidia right now; it's mostly a matter of personal preference/fanboy influence. <=== ATi Fanboy!

Then you run it comparatively. Change only the GFX card each time and you get a true result: everything else is the same, so nothing can affect the results other than the graphics card. Think scientific!
3DMark is a more reliable way of determining the maximum potential of a graphics card, as some game engines run faster on ATi chipsets and some run faster on NVIDIA ones. Even then, some chipsets perform better at different resolutions and different filtering levels.
At least the 3DMark series equally weights the tests that are NVIDIA optimised, and those that are ATi optimised.
True benchmark style: give neither an eventual advantage and you get a balanced result, not an HL2 timedemo which runs better on ATi chips at 1024,0,0 (resolution, AA, AF) and better on NVIDIA ones at 1280,2,2.
HL2 should not be used as the only benchmark, which is why Futuremark do so well with their 3DMark brand.
Some cards are going to be better at some things, which is why the benchmark is vital, as it measures <b>POTENTIAL</b>, not actual current power.
Look at the "Return To Proxycon" demo with 3D'05. Nothing today uses that kind of technology. It's a test of potential, as one day, the games are going to be like that, and my card is going to still be rocking a playable 30fps when it does.
I use it as a measure of (sorry for this) "futureproofing" a system.
And to b*tch to people about how much NVIDIA owns ATi across most of the market!
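For what it's worth, that "change only the graphics card" idea really does come down to a few lines of arithmetic on the timedemo output. A minimal Python sketch, assuming you've already recorded the average fps of each run yourself; the numbers below are placeholders, not real results.

CODE:
# Compare two cards from the same timedemo on an otherwise identical system.
# The per-run fps values are placeholders -- paste in your own runs.

def average_fps(samples):
    """Mean frame rate over several runs of the same demo."""
    return sum(samples) / len(samples)

def percent_difference(a, b):
    """How much faster card A is than card B, in percent."""
    return (a - b) / b * 100.0

runs = {
    "card A": [78.4, 79.1, 77.9],   # same resolution, AA/AF, drivers, everything
    "card B": [71.2, 70.8, 71.5],
}

a = average_fps(runs["card A"])
b = average_fps(runs["card B"])
print("card A: %.1f fps, card B: %.1f fps, A vs B: %+.1f%%"
      % (a, b, percent_difference(a, b)))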
QUOTE (Lt Patch): Then you run it comparatively. Change only the GFX card each time and you get a true result [...] And to b*tch to people about how much NVIDIA owns ATi across most of the market!

I linked to the HL2 benchies because I felt they were the most appropriate, as this is an HL-based forum. But also because they were the only ones in which ATi had a distinct advantage :D
Which brings me perfectly to my reason for using benchmarks.
People use the most appropriate game for showcasing their range of cards, and they also pick the best settings.
You could basically turn those benchmarks into an advert for ATi, saying "Look, the X800XT owns 2x 6800 Ultras in HL2!"

Which would be fair enough; however, as theclam has pointed out, you don't run an SLI system at 1024. I have 2x 6600GTs in my system, and everything gets played at 1280 or not at all. And I have what is known as "a mid-range SLI setup". MID-RANGE, and I can run HL2 at better speeds than just about every ATi system I've run the same level on, including a friend's X850XT-based rig. That impressed the hell out of me, and almost drove my friend to tears, given that he'd just spent 2 grand on that system only to be whipped by a mid-range SLI setup. I only beat his frame rate by something like 0.4 fps, but the fact that mine could even match his, let alone marginally beat it, was enough. He's now running 2x 6800 Ultras and has nearly doubled my 3DMark05 score of 6300; his system now scores 11000 and trounces mine on everything.

He was actually sold on the idea of ATi cards beating NVIDIA ones at every level, until he found out that a mid-range NVIDIA setup in a high-end gaming PC can actually score higher frame rates than the highest-end ATi ones.

But that doesn't mean benchmarks are the be-all and end-all of it. Gaming technology changes over time, and we've seen periods when NVIDIA have been all over ATi and their other rivals, and times when ATi have been all over NVIDIA. Right now we're back at a point where NVIDIA reign supreme for most gamers, and I'm certain we'll see that change when ATi deliver some ground-breaking technology, which will be countered by NVIDIA, then countered again by ATi, and so on.

The whole graphics market is just a big seesaw, as is the CPU market: Intel are now thrusting back with 4-effective-core CPUs (2 physical, 2 HT'd), all running at 3.8-odd jigga-hurtz, solely to best AMD's FX range.
Anyhoo, I'm just waiting for the day when Matrox come around, and whip everyone, just like they used to be able to...
How did you manage to get 2 6600GTs to outperform his X850XT? Either there's some serious tweakage going on here, or the rest of your systems are different. An X850XT should outperform 2 6600GTs, even in Doom 3. The X850 XT can compare to SLI systems at some resolutions/in some games, and sometimes beats them.
My old 9600 Pro can pump HL2 at a nice 60 fps even in a heavy battle, with the average at 100 fps
That was until my PC started hating me... I think I've worn the poor gal down *pokes vid card*
Dual Radeon X850 XTs will outperform dual nVidia 6800 Ultras in most anything, but the Ultras will provide SLIGHTLY more dependable coverage. As a trade-off, nVidia draws more juice.
That may be my problem...
*goes out to buy an 850 watt power supply*