Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
Ergh. Show a shot of the AR next, EEK. I'm too lazy to boot up D3 just to look at it, but I don't remember the top looking THAT crappy from the last time I was running around.
No idea how people can defend such ****ty and blatant tactics: garnering stupid people as a market base with high FPS counts but fugly image quality.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
I actually went back and read through that Anandtech article... it's a classic biased benchmark, if not in the flavor text, then in the particular games chosen. Note how many of them are OpenGL, as opposed to DX8 or 9. Nowhere near an even split.
Also, any time the 6800 series pulls out a higher framerate it tends to be praised heavily in the reviewer-text, with fingers pointed toward 'see, we can still beat ATi!'... however, when the ATi parts pull out a can of whoop*** in a given game, it's described as a 'marginal victory' or a 'narrow lead'. Or even more obviously, they praised how the 6800 is LESS behind the X800 in FarCry, even though the graphical errors have not been fixed yet. :) I'd love to see a benchmark between the 6800 UE and the X800 XTPE that also compared the IQ, rather than just the framerate.
Again. nVidia delivers high-speed crap. ATi delivers slightly-slower (like you'll notice the 5-10fps in most games), full-quality images. I guess that's another thing people picked up from CS. They don't care if it looks like crap, so long as they can type '100fpsomgWTHhahahalooser!!11!' over the global chat.
QUOTE (Talesin @ Aug 13 2004, 05:53 AM): "Ergh. Show a shot of the AR next, EEK. I'm too lazy to boot up D3 just to look at it, but I don't remember the top looking THAT crappy from the last time I was running around. No idea how people can defend such ****ty and blatant tactics: garnering stupid people as a market base with high FPS counts but fugly image quality."
Best I could do, since I just loaded an autosave and played for 3 minutes.
I just realized I was pumping out 71 frames on my 9800 on that last shot :D
This is weird though - is my AR missing something on it? His has a 'textured' look, but it looks more like horrendous artifacting than any actual reasonable texture...
Funny thing is, I had 43 FPS when there were TWO hellknights on my screen :D
(There's no point to this shot except to show how pretty it is)
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
That's the AR I know and love. :D
No, the 'texture' on the GeForce-rendered AR is an example of one type of bad normalmap compression artifacting, combined with cut color-calculations.
It's essentially the same thing that's going on with the hand, just that the GFs can't use the cool little trick the ATis do... stripping the alpha channel from a normalmapped texture (as it isn't used) and putting the normal info in that channel for compression. GeForces have to compress the normalmap separately, which results in ugly, terrible artifacting.
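Roughly, that trick amounts to something like the sketch below (purely illustrative, not ATi's or id's actual code): a tangent-space normal is unit length, so only X and Y need to survive compression, X can ride in the DXT5 alpha block (which is stored separately and at higher quality than the colour channels), and Z is rebuilt per pixel.

```python
import math

# Sketch of the alpha-channel normalmap trick (illustrative only, not actual driver code).
# A tangent-space normal is unit length, so Z can be recomputed from X and Y.

def pack_normal(nx, ny):
    """Store X in the alpha channel and Y in green, mapped from [-1, 1] to [0, 255]."""
    alpha = round((nx * 0.5 + 0.5) * 255)  # DXT5's alpha block compresses with more precision
    green = round((ny * 0.5 + 0.5) * 255)  # green gets the most bits of the colour block
    return alpha, green                     # red/blue are unused and can be discarded

def unpack_normal(alpha, green):
    """What the pixel shader effectively does: rebuild Z = sqrt(1 - X^2 - Y^2)."""
    nx = alpha / 255.0 * 2.0 - 1.0
    ny = green / 255.0 * 2.0 - 1.0
    nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    return nx, ny, nz

# Quick check with a made-up normal:
print(unpack_normal(*pack_normal(0.3, -0.4)))  # roughly (0.3, -0.4, 0.87)
```

The point is that only two channels have to be protected, and the one most prone to blocking gets the best-quality block; compressing the full RGB normalmap directly is what gives the blotchy 'texture' on the AR.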
Also, notice the ugly white garbage along the GF-zombie's (viewer's left) arm, where the fat zombie is behind it. That little dip-in, apparently a fold of cloth, should NOT be white-edged. Another fault in nVidia's rendering... or more precisely, another cut corner sacrificed in the name of higher fps.
I mean, seriously. If you want the game to look crappy, just play it on the 'low' detail settings. Don't shell out for a crappy card that *can't* do the <i>good</i> quality, when you want/need it.
The thing about video cards is that you have to trade quality against quantity (of frames). To say that FPS doesn't matter is foolish - it does - but frankly, anything over 40 is entirely playable. I played through most of Doom 3 at around 20 fps max before I upgraded to the 4.9 Cats and other tweaks.
I'll go with Nvidia if ATI gets worse, not if Nvidia gets better.
QUOTE (EEK @ Aug 13 2004, 08:44 AM): "The thing about video cards is that you have to trade quality against quantity (of frames). To say that FPS doesn't matter is foolish - it does - but frankly, anything over 40 is entirely playable. I played through most of Doom 3 at around 20 fps max before I upgraded to the 4.9 Cats and other tweaks. I'll go with Nvidia if ATI gets worse, not if Nvidia gets better."
I'm playing it at 5-15 FPS max in most areas and still having fun. ;) It's surprisingly playable at about 10 FPS.
http://graphics.tomshardware.com/graphic/20040812/index.html - but since the benchmarks were done by Nvidia, I'm going to wait for THG to verify them.
I showed that benchmark because I wanted to show how narrow-minded the people were who said ATi > Nvidia. They have advantages and disadvantages, and we could argue over benchmarks forever (lord knows people have). I suggest you don't over-analyze; the differences are so tiny that if you choose "wrong", you'll hardly notice.
PS: I've heard that the X600s are garbage, getting schooled by 9800s.
QUOTE (Talesin @ Aug 13 2004, 05:20 AM): "Not to mention the bits about nVidia cutting colour calculations on the back-end to 12 or 8-bit color to generate a higher fps reading, but looking like crap. :)
You have that backward, Jim. If you want a crap image, but high framerates of said crap, go with an nVidia. ATi keeps things at the full 24-bit color, and does actual, proper color calcs. Meaning no shadow-banding, screwed up normalmaps, or low-grade miplevel when dithering is required. :) So if you like to look at a game and have it look like a beautiful next-gen game, you'll be needing an ATi.
Still, for that pricepoint, I'll hold judgement until side-by-sides are done between the 6600 and X600. Most benchmarkers will do a mouseover-toggle so you can see what changes, and which looks better. Too bad the TFSAA settings have to be pseudo-generated, as 12x looks absolutely stunning even on a 9500 Pro. :)"
You mean 16 or 12 (integer precision), right?
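For a feel for what those bit depths mean in practice, here's a toy sketch (my own made-up illustration, nothing measured from either card): quantise a smooth gradient to N bits and count how many distinct shades survive. Fewer surviving shades is exactly what shows up on screen as banding.

```python
# Toy banding demo (illustrative only): quantise a smooth 0..1 gradient to N bits
# and count the distinct output levels that remain.

def quantize(value, bits):
    levels = (1 << bits) - 1
    return round(value * levels) / levels

gradient = [i / 4095 for i in range(4096)]  # a finely sampled smooth ramp
for bits in (6, 8, 12, 16):
    distinct = len({quantize(v, bits) for v in gradient})
    print(f"{bits}-bit pipeline -> {distinct} distinct shades out of 4096")
```

Whether the shortcut is lower integer precision or reduced floating-point precision in the shader, the symptom is the same: smooth gradients and soft shadow falloffs collapse into visible steps.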
QUOTE (Hibame @ Aug 14 2004, 12:40 AM): "I heard that the human eye can see at about 60 fps, so why go for 120 o.O Why not just keep at maybe lets say 70 and keep boosting quality"
Human eye caps out at just over 20 FPS if I'm not mistaken...
The eye doesn't cap at ANY FPS, because it doesn't USE FPS. The higher the FPS, the smoother it will look. Now, you may not consciously notice it between 1000 and 2000, but, to your eye, there is still a difference. In fact, I'd say that the max FPS an eye can handle is 299,792,458 frames per second, if you're a metre away from your monitor. That number increases as you get closer.
Eh, googled because I can't remember the oft-linked article: http://www.100fps.com/how_many_frames_can_humans_see.htm
QUOTE (Wheeee @ Aug 14 2004, 12:47 AM): "No, you can see well above 200 fps. Eh, googled because I can't remember the oft-linked article: http://www.100fps.com/how_many_frames_can_humans_see.htm"
The fact that the eye can intelligently process an image flashed for 1/200th of a second does not mean that your eye is refreshing its picture at that FPS. Your nerves pick up the brightness in the image and it gives you an afterimage effect, so you see the image for longer than it actually existed - long enough to barely analyze its content. If you are watching a fan spinning, somewhere between 20-30 RPS it appears perfectly still to your eye. That is because your eye is picking up the location of the fan blades at exactly the same point, one spin ahead of where they were in the last image the eye processed (the same thing happens one blade-spacing ahead rather than a full spin, which is why we commonly see it on car wheels that don't spin that fast, thanks to recurring hubcap or mag patterns). That is the maximum rate at which your eye processes images, so any FPS above that rate will appear more or less perfectly fluid, especially since they are bright images from a computer screen. (Don't worry, your eye will sync properly to images flashing at an odd FPS compared to its maximum, as long as the image makes enough sense for the brain to go "oh, I see what's happening", which computer games generally do.)
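Whatever the eye is or isn't doing internally, the frozen-fan observation is the classic wagon-wheel effect, and it falls straight out of sampling at any fixed rate. A quick sketch with made-up numbers:

```python
# Wagon-wheel / frozen-fan effect under fixed-rate sampling (made-up numbers).
# Blades are indistinguishable from each other, so only the angle modulo the
# blade spacing matters.

def apparent_angles(rps, blades, sample_hz, n_samples=5):
    step = 360.0 * rps / sample_hz   # true rotation between consecutive samples (degrees)
    spacing = 360.0 / blades         # rotating by one blade spacing looks like no motion
    return [round((i * step) % spacing, 1) for i in range(n_samples)]

print(apparent_angles(rps=6.0, blades=4, sample_hz=24))  # all 0.0 -> the fan looks frozen
print(apparent_angles(rps=5.5, blades=4, sample_hz=24))  # apparent angle drifts 'backwards'
```

At a 24 Hz sampling rate, a 4-blade fan at exactly 6 rev/s puts every blade back where the previous blade was on each sample, so the apparent angle never changes; slightly off that speed and it seems to creep forwards or backwards.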
Epidemic | Dark Force Gorge | Join Date: 2003-06-29 | Member: 17781 | Members
If nothing else, it will probably push down the 9800 Pro prices :)
QUOTE: "This lovely shot courtesy of a GeForce card, full settings max :D"
Which specific card is this taken on? 5xxx series had IQ problems, not so with the 6xxx series.
Talesin is gonna be **** at me, but I think I'm gonna grab a 6800 non-Ultra with the Ultra's pipelining off Newegg for $300... (The card comes with all 16 pipelines but only 12 are active; a softmod enables the last 4.) I read on Anandtech's forums they were all jumping for joy over this card...
QUOTE (CommunistWithAGun @ Aug 14 2004, 03:43 PM): "Talesin is gonna be **** at me, but I think I'm gonna grab a 6800 non-Ultra with the Ultra's pipelining off Newegg for $300... (The card comes with all 16 pipelines but only 12 are active; a softmod enables the last 4.) I read on Anandtech's forums they were all jumping for joy over this card..."
Do note that a 6800's remaining pipelines might actually NOT work properly. And if that's the case, hope that you can turn them back off or you're a long way from home.
QUOTE: "http://www.unknownworlds.com/forums/uploads//post-10-1091515728.jpg
This lovely shot courtesy of a GeForce card, full settings max :D"
Something tells me that screenshot was taken with a GF4, care to enlighten us?
QUOTE: "My computer specs: AMD 2100+, GeForce 4 Ti 4600, 512MB DDR PC2700 RAM.
On low settings I was getting about 40-50 fps (in combat).
Set the graphics on ultra high with 4x anti-aliasing and I was getting 35-45 fps (in combat).
Don't ask me how my computer's doing it, but I love this baby!
And here's a screen for you guys that don't have it yet (maxed out graphics). Excuse the JPEG compression, and don't mind my health - I was trying to figure out how to take a screenshot while they were beating on me :P"
I remembered this screen (I sucked up every Doom 3 pic on the forums ;)); a quick search brought up this: http://www.unknownworlds.com/forums/index.php?showtopic=76580&st=150
On topic: let's see how those new cards perform. Sooner or later I will have to replace the best card ever, my ol' trusty Ti4200 :confused:
QUOTE (Svenpa @ Aug 15 2004, 11:31 AM): "Are the 6600 only for PCI express? And are u sure that the 256mb dosent do any noticeble change? even with the games out now?"
It's not only for PCI Express, but it does support it, so you can have two 6600s in your box for double the performance at roughly $300 or $400, and I am sure that will definitely be worth it.
As for color miscalculations and such, I don't see any problems in that GeForce screenshot. It looks completely realistic to me, and if I were to notice any small details, it would only be in screenshots, because when you are playing the game you aren't pointing out the tiniest flawed details - they aren't noticeable when you are in a firefight.
As for that review, yes, all those games were OpenGL, but they were ALL good games. The ending of the review made a lot of sense to me: they stated that if you enjoy more games that are OpenGL, you would be happier with an Nvidia card; likewise, if you like more games that work well with DX8 or 9, then ATi is for you.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
Jim, you are defending a flawed position. Look at the texture and normal corruption on the AR. Look at the HAND. Look at the terrible, horrible shadow-banding that is a tell-tale of the corners nVidia has been forced to cut, just so their fanboys will have some small measure of hope... like a homeless family huddled around a last candle.
Introducing: the GeForce 143042000! Part of the GeForce 23 series.
With Doom 3, it can get 240,021 fps at 1600x1200.
With Doom 4, it can get 12 fps at 1600x1200.
And with Doom 5, it can get 0.02 fps at 1600x1200.
HOW FAST IS THAT!?
128MB version or 256MB version... will it make a HUGE difference? (The 256MB version is about $70 more.)