Nvidia Or Ati?
TommyVercetti
Join Date: 2003-02-10 Member: 13390Members, Constellation, Reinforced - Shadow
in Off-Topic
GeForce 6800 vs. X800

I'm in the market for a new PC, and I feel like designing my own. I hear that the ATi X800 gets better framerates at high resolutions and settings, but does not support the newest DX9 shader effects (Shader Model 3.0). So, can you geniuses (I'm serious) help me out with the video card decision? The games I'm most likely going to be playing on it are Doom III, Half-Life 2, STALKER: Shadow of Chernobyl, and others like them. So, which would you recommend? Keep in mind I really love the eye candy.
Comments
Haven't you been reading any of the other topics? ATI owns NVidia in every way.
Don't touch NVidia: terrible cards, terrible drivers, terrible service.
Perhaps what you're thinking of is 'Pixel Shader 3.0'... which is mostly a marketing scam, as any card that can run actual PS2.0 and has an on-board hardware T&L unit (i.e., any ATi card after the Radeon 9500) can do most of those effects with minor driver-level modifications by the coders.
Oddly, nVidia is boasting that they can do them... when the 6800 series is the first nVidia card to actually handle ANYTHING past PS1.4. :)
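(For context: the headline PS3.0 feature is dynamic flow control, i.e. a PS3.0 shader can branch per pixel and skip work, while PS2.0-class hardware generally has to evaluate both sides of a "branch" and select a result. Below is a toy Python sketch of that difference; it is a simulation with an invented cost model, not actual shader code.)

```python
# Toy model of PS2.0-style "compute both, select" versus PS3.0-style
# dynamic branching. Invented cost model; not real shader code.
def expensive_lighting(p):
    return p * 0.8 + 0.1      # stand-in for a costly lighting path

def cheap_fallback(p):
    return p * 0.5            # stand-in for a cheap path

def shade_ps20_style(pixels):
    out, work = [], 0
    for p in pixels:
        lit, flat = expensive_lighting(p), cheap_fallback(p)  # both paths run
        work += 2
        out.append(lit if p > 0.5 else flat)                  # then select
    return out, work

def shade_ps30_style(pixels):
    out, work = [], 0
    for p in pixels:
        # True dynamic branch: only the needed path runs for each pixel.
        out.append(expensive_lighting(p) if p > 0.5 else cheap_fallback(p))
        work += 1
    return out, work

pixels = [i / 9 for i in range(10)]
img_a, work_a = shade_ps20_style(pixels)
img_b, work_b = shade_ps30_style(pixels)
assert img_a == img_b and work_b < work_a    # same image, half the work
```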
Short version, the X800 series kicks the living crap out of the 6800 series. The XTPE is the fastest consumer card on the market ($500 MSRP). The 6800 Ultra Extreme ($700 MSRP) is a 'golden sample' card, which means they find the BEST ones out of their stock and clock them higher. They STILL can't keep up with the X800 XTPE, draw two DEDICATED molex power connectors, and hog two expansion slots. They also go back to the 'dustbuster' cooling system that was mocked so much on the FX5800.
Go for the ATi. Your eyes, ears and wallet will thank you later.
The images look nearly identical to me on both cards unless you get out a magnifying glass (and even then the difference isn't much).
The benchmarks are actually pretty similar. The two cards are neck and neck.
I'm not saying they go bad.
I'm saying they CAN'T DO CRAP.
> I'm not saying they go bad. I'm saying they CAN'T DO CRAP.
Example?
I'd also like to know about the "terrible service" claim, since Nvidia doesn't actually make the cards.
> I'm not saying they go bad. I'm saying they CAN'T DO CRAP.
Much like ATI's Mobility or Linux drivers, I hear.
And JHunz, the Mobility series uses the same drivers as the standard desktop boards.
The Linux drivers are just about as difficult to install as any other module (easier, actually, given that they come with an option to compile a custom module based upon your source tree, rather than a static-linked binary)... The only kvetch I have with them is that they don't have a 64-bit Linux version available yet.
> And JHunz, the Mobility series uses the same drivers as the standard desktop boards. [...]
I've not heard a single source say it's like the old "dustbuster". They say it CAN be, but it never got that hot.
This isn't really the place to go for balanced advice. There are too many fanboys on either side.
Really, you can say that of <i>any</i> internet forum. Best bet is to just flip a coin, I say.
So what? Go to Alienware's page and read their 'testimonials'. Guess who got paid the most?
Secondly, you are NOT a game developer. If you care to prove me wrong, give me a title, name, and publishing company, but the ONLY thing I've ever seen you able to do is run your Linux emulators.
Third, even though you CLAIM to be a game developer (news flash: coding your little Flash applets doesn't make you a developer on par with John Carmack), you say that frame rates are superb. Yeah, sure; just like Pentiums, NVidia sells bigger numbers rather than better quality, so all the idiot kiddies out there think their penises are bigger (these are the same people who think refresh rates are bunk and brag that they get 100 fps in Half-Life).
Fourth, and going back:
> I'd also like to know about the "terrible service" claim, since Nvidia doesn't actually make the cards.
That sounds like reason enough to me why they're ****. You get an ATI made by ATI, you know what you're getting. You go to buy an nVidia card, you have to check which manufacturers are better and which will screw you, and I got SCREWED on my Geforce 4 (the thing melted its own cooling unit).
Also don't forget about PCI Express now, Talesin. If you want the best money can buy, you can put two video cards into your system, doubling the power and easily destroying ATi. Geforce is doing a lot for the market and they keep getting better and better. Yes, ATi is damn good (which is great; it creates competition, which means lower prices), but if you truly want the best, get a pair of Geforce 6800 Ultras with PCI Express. Beware if you do that: it means you're going to have to get a motherboard that has PCI Express slots (which unfortunately take up two of your regular PCI slots; wouldn't matter for me though, 'cuz I don't use any PCI slots).
If you want more reviews and ranting on this topic, you should check some previous threads that are in Off-Topic.
3DFX is dead for a variety of reasons, mainly because they tried to pull that same **** - pass off lower cards and say 'well you should buy two to do it best' or something.
That said:
<a href='http://www.nordichardware.com/reviews/graphiccard/2004/r420/index.php?ez=10' target='_blank'>http://www.nordichardware.com/reviews/graphiccard/2004/r420/index.php?ez=10</a>
Now flip through: the 6800 beat the X800 in only a couple of games, and then only by a difference of less than 8 FPS. Whereas when the X800 beat the 6800, it did so by a massive amount.
At 1600x1200 with 4xAA/8xAF, the X800 did more than THREE TIMES as well as the 6800. That is freaking PATHETIC.
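(The review's actual numbers are at the link above; with made-up figures of the same shape, the arithmetic behind "more than three times" looks like this.)

```python
# Hypothetical FPS values (NOT the review's data), just to show the math.
x800_fps, gf6800_fps = 45.0, 14.0            # 1600x1200, 4xAA/8xAF scenario
ratio = x800_fps / gf6800_fps
print(f"X800 leads by {ratio:.1f}x, a gap of {x800_fps - gf6800_fps:.0f} FPS")
# -> X800 leads by 3.2x, a gap of 31 FPS
```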
Also, this article is HIGHLY nVidia-biased. They wave off graphical errors and corruption on the NVidia cards (read the end of that Far Cry page for one) and say 'but we'll just wait for the next patch'. Then read the Temporal AA page: they lambast it as crap since it doesn't work at under 60 FPS. What's worse, NV40 drivers that GIVE ERRORS, or a feature that is working just as intended but not how THEY want it? Apparently the latter.
As for the pixel shader: consider how many games have been created for FUTURE cards using FUTURE technology lately. Sorry, but right now the only thing we have to go on is Unreal 3. And frankly? The super-ultra-vast majority of games do NOT use 'potential' technology unless they're ULTRA POSITIVE that they'll sell. Look at games like Call of Duty: it was built on existing technology. Games like Doom 3 raise the bar, but you're fooling yourself if you think that within a year more than a handful of games will be using Pixel Shader 3.0, and by the time they do, it won't matter that NVidia had it first.
Dreamworks and Pixar have ultra-powerful machines and superior rendering systems. Do game developers make games that look like those? No, almost never. Because if they did, only a dozen people would be able to run the game.
Buying the 6800 for PS3.0 is sheer idiocy. If you did, you're an utter fool.
Don't you think he knows? He's been in pretty much every one. He's known in <i>other planes of reality</i> for hating nVidia with a burning passion.
The "advanced technologies" of the 6800 are not going to be used a whole lot for at least 2-3 years, at which point those technologies will be standard on every card, and will be more efficient than they are now. When you buy the 6800, you're investing in future technology that will be done better in less than a year, on lesser cards.
Be smart; wait for the PCI Express mobos and X800s to come out and buy those. That will be worth the time and money.
> 3DFX is dead for a variety of reasons, mainly because they tried to pull that same **** - pass off lower cards and say 'well you should buy two to do it best' or something. [...] Buying the 6800 for PS3.0 is sheer idiocy.
Yeah, you're right, 3DFX did develop the PCI Express, and then Nvidia bought 3DFX, thus allowing them to update their hardware and resell it.
EEK, you should really do some research on this, but since it's obvious you didn't do your homework, I will give you a basic rundown of how PCI Express works.
Basically, you have two cards (right now Nvidia is saying they should be of the same type, for instance two Geforce 6800 Ultras instead of, say, a Geforce 6800 Ultra and a Geforce 6800 GT). The graphics work to be processed is divided in two, and each card is given an equal amount of work to do for the final output. Once the processing is done, the two outputs are combined into one final image, which gets displayed.
If you still don't understand how it works, go look around for a couple articles on it.
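(A minimal sketch of the split-frame idea described above, assuming a naive 50/50 division of the screen; real drivers load-balance the split dynamically. This is a toy simulation, not driver code.)

```python
# Toy split-frame rendering: two "GPUs" each shade half of the frame's
# rows, and the halves are stitched into one final image.
def shade(x, y):
    return (x * 31 + y * 17) % 256        # stand-in for per-pixel work

def render_rows(rows, width):
    return [[shade(x, y) for x in range(width)] for y in rows]

def render_dual_gpu(width, height):
    split = height // 2                    # naive 50/50 split
    top = render_rows(range(0, split), width)          # card 0's share
    bottom = render_rows(range(split, height), width)  # card 1's share
    return top + bottom                    # combine into the final output

frame = render_dual_gpu(640, 480)
assert len(frame) == 480 and len(frame[0]) == 640
```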
That being said, one test does not mean that a certain card is better than the others. This has already been described in previous threads, but I will say it again. We already know that the Geforce gets its **** kicked in high anti-aliasing tests (that's what ATi is good at!!). But guess what: whenever OpenGL comes around, guess whose **** is getting kicked (you probably thought Nvidia, and you were horribly wrong). Nvidia does a fine job of kicking ATi's **** in OpenGL, therefore giving them the edge in that market.
And about the pixel shaders: READ MY FREAKIN POST, MAN. I said that Nvidia has the technology to allow GAME DEVELOPERS to begin work on games that actually use this technology by using their hardware. By no means did I say that GAMERS would benefit from this card (except that they can look forward to seeing nicer-looking games a lot sooner than they probably would have).
My point is that both cards are similar, and you can't give one test to show that one card is exponentially better than the other, because then you're not giving a fair review! You have to give the whole picture, and when the whole picture is drawn in this manner, you will find that both cards are very similar, except one developer has the technology to allow two video cards in one system instead of just one.
It is a fact, 2 is better than 1, you can't dispute that.
<b>SLI</b>, or Scan-Line Interleaving (nVidia's revival of the name officially stands for 'Scalable Link Interface')... the system that uses a pair of bridged cards.
PCI Express (PCIe; often mislabeled 'PCI-X', but that's actually a separate server-oriented standard) is a replacement for the AGP port on your motherboard. It offers far greater bandwidth for large amounts of data transfer (usually textures are the largest part).
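(To put numbers on "far greater bandwidth": first-generation PCIe runs 2.5 GT/s per lane with 8b/10b encoding, so an x16 slot moves about 4 GB/s each direction, versus roughly 2.1 GB/s total for AGP 8x. Back-of-envelope:)

```python
# First-generation PCI Express vs. AGP 8x, back-of-envelope.
pcie_lane = 2.5 * (8 / 10) / 8        # GB/s per lane per direction (8b/10b)
pcie_x16 = pcie_lane * 16             # ~4.0 GB/s each direction
agp_8x = 0.266 * 8                    # AGP 1x is ~266 MB/s; 8x strobing
print(f"PCIe x16: {pcie_x16:.1f} GB/s per direction; AGP 8x: {agp_8x:.1f} GB/s")
```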
However, SLI is only available (last I saw from their page) on the PCIe version of a hand-selected crop of 6800 Ultra Extremes, with custom PCBs.
Short version, they will be as rare as hen's teeth, require a standalone PSU to run them, be noisy as all crap, and deliver approximately 150%-190% of the current 6800 UE performance numbers, which will leave the X800 XTPE in the dust. But that assumes you have at least $700 per card (MSRP for a 6800 UE), plus a markup for the PCIe version, plus a markup for the SLI bridge PCB, plus a markup for the motherboard that has the slots to support it.
If you're willing to spend approximately $2000 for the absolute cream of the crop consumer-grade 3D card, then that's fine with me.
However.
You could probably liquid- or vapor-cool an X800 XTPE, get a custom BIOS from ATi, and overclock it to hell and back for about half the cost, the same performance, and a little bit of risk. ;) Oh, and you wouldn't have to switch out your motherboard.
Now, are we both done talking about fantasy-cards that few, if any, will ever realize?
Short version. The XTPE is $500, and is faster than the UE at $700. And let's not even talk about how easy it is to walk down to the local Fry's and pick up an XTPE, as opposed to trying to find one of the rare 'golden sample' 6800 UE cards... it'll only go downhill. :)
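(Treating the post's MSRPs as given and plugging in hypothetical frame rates, the value math is straightforward.)

```python
# MSRPs from the post above; FPS figures are hypothetical, for illustration.
cards = {"X800 XTPE": (500, 70.0), "6800 Ultra Extreme": (700, 63.0)}
for name, (price, fps) in cards.items():
    print(f"{name}: {fps / price * 100:.1f} FPS per $100")
# X800 XTPE: 14.0 FPS per $100; 6800 Ultra Extreme: 9.0 FPS per $100
```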
The "advanced technologies" of the 6800 are not going to be used a whole lot for at least 2-3 years, at which point those technologies will be standard on every card, and will be more efficient than they are now. When you buy the 6800, you're investing in future technology that will be done better in less than a year, on lesser cards.
Be smart; wait for the PCI Express mobo's and X800's to come out and buy those. That will be worth the time and money. <!--QuoteEnd--> </td></tr></table><div class='postcolor'> <!--QuoteEEnd-->
Since my ceiling is about $3,500, I'm going for the X800 XT P.E. Two 6800s are simply not worth it.
And to answer your question, these games are the most important to me:
1. STALKER: Shadow of Chernobyl
2. Half-Life 2
3. Doom III
> I'm not saying they go bad. I'm saying they CAN'T DO CRAP.
> Example? I'd also like to know about the "terrible service" claim, since Nvidia doesn't actually make the cards.
Far Cry? Splinter Cell: Pandora Tomorrow? ANY game using pixel shaders (other than Doom 3; it's OpenGL-based, so they'll be pretty close to the same).
> SLI, or Scan-Line Interleaving... the system that uses a pair of bridged cards. [...] Short version. The XTPE is $500, and is faster than the UE at $700.
Sorry, yeah, it's called SLI, but from my knowledge and the reviews I have read, it won't be rare, and it certainly isn't going to cost that much. It also isn't JUST for the cream-of-the-crop Geforce cards. I think the lowest cards they will support are the Geforce 6800 GTs.
As for you, EEK, I hope that's not your rebuttal... Of course I didn't mean two chips of much lesser value than one big one. We are talking about two cards that are different in many ways but really similar in benchmarks, etc. I hope you aren't saying that the X800 card is more than two times better than the Geforce equivalent.
They sold fewer than a thousand of the cables in the consumer market, which were *required* for SLI.
It was a bomb. A goose-egg. A cash-sink that helped to drag them downward. I only find it amusing how many of 3Dfx's mistakes nVidia is DUPLICATING, SLI only being the latest.
In the case of nVidia, the cards must be SLI-specific... which the early review boards are NOT. They don't have the headers. The average consumer board may not have the headers, either.
See prior note about 'jack up price for X feature'.
Regardless, you're going to be paying at least $800 (assuming a pair of GTs) for a monstrous space heater that will dominate your case and give only marginally better performance (if that) than the ATi solution. You'll also be giving up two full, dedicated power leads from your PSU, which leaves most people with one to run all of their HDDs, optical drives, and any accent lighting.
That's right. For EACH of these cards, nVidia demands the sacrifice of one PSU lead... on the 6800 UE, make that two PSU leads PER CARD. You'll literally have to put in a second PSU *just to run the video cards*.
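(A back-of-envelope lead budget, assuming a hypothetical mid-range PSU with six molex leads; as a reply below notes, actual PSUs vary a lot.)

```python
# Illustrative PSU lead budget for a dual-card 6800 Ultra-class setup.
psu_leads = 6              # assumed molex leads on a mid-range 2004 PSU
leads_per_card = 2         # per the post: two dedicated leads per UE card
cards = 2
remaining = psu_leads - cards * leads_per_card
print(f"Leads left for HDDs, optical drives, fans: {remaining}")   # -> 2
```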
> That's right. For EACH of these cards, nVidia demands the sacrifice of one PSU lead... You'll literally have to put in a second PSU *just to run the video cards*.
This is where 3DFX went absolutely nuts. On the very rare high-end model of the Voodoo5 (I believe it was called the "6000"), their prototypes required something different.
You know that cable going straight from your wall outlet/surge protector to the back of your computer? Yeah. Think of having a 2nd one of those cables plugged directly into the back of the video card. I didn't believe it either until I <a href='http://www.sysopt.com/articles/21stCenturyTrends/v5_6000.png' target='_blank'>saw it</a>.
In any case, it's evident that nVidia's engineers aren't pushing as hard as ATi's in terms of power consumption and heat.
All I know is I've never had problems with any Nvidia cards I own.
And after looking at a lot of benchmarks, since I have to get a new video card now, most of the higher-end ones, like others have said, are alike in many ways, but each card has its own advantages.
Personal preference... buy a card for what you want it to do, and make sure to research it thoroughly to make sure it will work with all your current hardware.
> You'll literally have to put in a second PSU *just to run the video cards*.
Am I the only one that has a PSU that is overflowing with molex connectors? I must have like 8 of the things, totally in excess.