KungFuSquirrel | Basher of Muttons | Join Date: 2002-01-26 | Member: 103 | Members, NS1 Playtester, Contributor
I recently purchased a Radeon card when I was setting up my dual-monitor rig. It was $249 with no tax and no shipping on a deal from Newegg, and it produces stellar performance in every program I run - everything I've been playing lately (Elite Force 2, UT2K3, the Homeworld 2 demo, etc.) runs silky smooth at the highest possible resolution with all details cranked to the max, and that's 'only' on an Athlon XP 1900. I've got a kick-**** dual 20" monitor setup, and with the exception of some weird issues with Diablo II, I've never been happier with graphics card performance. This after previously owning a GeForce 256, a GeForce2 Ultra, and a GeForce3 Ti500.
I couldn't care less whether I'll run HL2 or Doom3 better - they'll run satisfactorily. In this case, it was cost over performance, and I got a hell of a deal for my money, all things considered.
QUOTE: "but it's also not in their best interests to create a sub-group specifically to write an nVidia-specific rendering pipeline, when the ATI card is scooting along just fine with standard OpenGL"
No, it is in Valve's interests to make their product run on every system they can possibly reach. Cutting out half the market is the commercial equivalent of shooting yourself in the head. I'm certain that Valve will be doing everything in their power, in conjunction with nVidia, to make HL2 run nice and sweetly on nVidia cards.
The most annoying thing I've found is that I was considering upgrading my card as well. Thanks to this thread, I'm now certain that if I posed the question "Which card should I get?" I'd be bombarded with responses that aren't exactly unbiased :P I had hoped video cards were free from the sort of divisions that exist between Athlon and Intel, but it seems I was wrong.
I just want to know what truth lies behind those charts on the first page.
First, how has the site managed to get a copy of HL2 to test the card's performance? It certainly can't have anything to do with Valve if they're trying to defend nVidia from the comments.
Also, what about the fact that it says most cards will run at 14fps? Am I the only person thinking "what the fudge"?
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
edited September 2003
QUOTE (Ryo-Ohki @ Sep 12 2003, 07:41 AM): "No, it is in Valve's interests to make their product run on every system they can possibly reach. Cutting out half the market is the commercial equivalent of shooting yourself in the head. I'm certain that Valve will be doing everything in their power, in conjunction with nVidia, to make HL2 run nice and sweetly on nVidia cards."

Yes. But only AFTER the game is complete. The fact that the ATI cards can take the standard OpenGL renderer and tear through what's asked of them is not a testament to ATI... it's a criticism of nVidia for REQUIRING programmers to code specifically for their card. If anyone, nVidia should be taking a long, hard look at their drivers if they can't intelligently take an OpenGL stream and convert it on the fly to the Cg crud.
So if it's a choice between having a kicka** game that only runs on half the video cards out there, or a mediocre game that runs on all of them...
Let's just leave it at this: I'd rather play the evolution of the original Half-Life revolution on my Radeon than yet another Daikatana that runs exactly the same on nVidia and ATI.
(edit) Yes, sites DO get advance copies of games. It's how they have reviews and previews out on time. Alternatively, the site may have been handed benchmark information by VALVe, if they don't want to let even one copy out of their control. Which is likely, given how tight-lipped they've been about HL2 in general over the years. (/edit)
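To put that complaint in concrete terms, here's a minimal, hypothetical sketch - invented names, nothing from the actual Source engine - of what a vendor-specific codepath means for the engine programmer: every feature that goes through DrawMesh() becomes two paths to write, test, and keep in sync.

CODE:
#include <cstdio>

// Hypothetical illustration only - invented names, not actual Source engine code.
struct Mesh { const char* name; };

enum GpuVendor { VENDOR_ATI, VENDOR_NVIDIA, VENDOR_OTHER };

class Renderer {
public:
    explicit Renderer(GpuVendor vendor) : vendor_(vendor) {}

    void DrawMesh(const Mesh& mesh) {
        // With a single standard path this branch wouldn't exist:
        // the same code would run on every conformant card.
        if (vendor_ == VENDOR_NVIDIA) {
            DrawMeshVendorTuned(mesh);  // extra path to write, test, and maintain
        } else {
            DrawMeshStandard(mesh);     // the one path everyone else shares
        }
    }

private:
    void DrawMeshStandard(const Mesh& m)    { std::printf("standard path: %s\n", m.name); }
    void DrawMeshVendorTuned(const Mesh& m) { std::printf("vendor-tuned path: %s\n", m.name); }

    GpuVendor vendor_;
};

int main() {
    Mesh crate = { "crate01" };
    Renderer(VENDOR_ATI).DrawMesh(crate);     // standard renderer
    Renderer(VENDOR_NVIDIA).DrawMesh(crate);  // duplicated effort for one brand
    return 0;
}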
BTW, I think HL2 uses just DirectX and Doom 3 uses just OpenGL. And I think Tim Sweeney, who makes the Unreal games, said that if you're making games for Windows you have to use DirectX rather than OpenGL if you want the best performance. Now I don't know who to listen to about anything.
QUOTE (Talesin): "Yes. But only AFTER the game is complete. The fact that the ATI cards can take the standard OpenGL renderer and tear through what's asked of them is not a testament to ATI... it's a criticism of nVidia for REQUIRING programmers to code specifically for their card. If anyone, nVidia should be taking a long, hard look at their drivers if they can't intelligently take an OpenGL stream and convert it on the fly to the Cg crud."
Look, I'm not here to debate whether ATI or nVidia is the better company. I know virtually nothing of either and I'm not taking sides. Regardless of what nVidia did to their cards, they still have a vested interest, as does Valve, in getting HL2 to run smoothly and decently on their cards. Now, I hardly think that Valve is going to drastically alter their product to suit nVidia; most of the work is going to be on nVidia's end. But saying that Valve won't make any changes at all to help nVidia isn't really a fair call. I mean, I'm sure you could make HL2 look even sweeter if you jacked the required processor up to a 4 GHz P4, but they're not going to do that, because few people have one. Same with the video cards: Valve isn't going to ignore half of their potential customers. They will change small aspects of the game to help nVidia. From a marketing standpoint, they have little choice, unless nVidia can sort out all the problems on their end without requiring input from Valve.
**** - it looks like nVidia is holding back the games industry just like 3dfx did, before an uproar from the gaming public forcibly bankrupted them. BTW, 3dfx was assimilated into nVidia; that might explain what we're seeing.
3dfx once said we don't need higher than 16-bit color rendering.
nVidia once said we don't need bump-mapping.
And I once said... yeah, that's right, we just need bread and water, right?
Well, I am getting a new graphics card within 4 hours.
It's going to be an ATI one. Not because of this post, but because it gets me more pretties for my £. It will be either a 9500 or a 9700. These are essentially the same card, since the 9500 can be conveniently overclocked to act like a 9700 :)
Wish me luck - I'm about to spend about three weeks' worth of wages. (I.e., I work one, maybe two days a week; £5 an hour is pretty damn good for a 17-year-old, so :P)
QUOTE (Ryo-Ohki @ Sep 12 2003, 10:41 AM): "No, it is in Valve's interests to make their product run on every system they can possibly reach. Cutting out half the market is the commercial equivalent of shooting yourself in the head. I'm certain that Valve will be doing everything in their power, in conjunction with nVidia, to make HL2 run nice and sweetly on nVidia cards."

It would be more in Valve's interests to force nVidia to adhere to industry-supported standards, instead of running off and forcing people to write separate codepaths for them, which wastes time and money.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
And it wastes programmers, who could otherwise be working on improving the game rather than being diverted to write codepaths for a video card brand that shouldn't need them to perform well.
The responsibility should be on the part manufacturer, NOT the programmer, to get the card performing acceptably on the same damn code.
Many parallels... 3Dfx went to a closed standard (anyone else remember GLIDE?); nVidia is going to a closed standard (Cg). 3Dfx put more stock in clocks than efficiency (SLI, Voodoo 4/5); nVidia puts more stock in clocks than efficiency (GFFX). 3Dfx cheated on benchmarks... wait. :P I'm just hoping that if nVidia meets the same fate (and yes, we still DO have 3Dfx fanboys), ATI will be intelligent and *at most* buy the hardware, while not absorbing the people.
I don't know about you, but a Half-Life without OpenGL is almost not worth buying :(
QUOTE (Wheeee @ Sep 12 2003, 06:15 PM): "It would be more in Valve's interests to force nVidia to adhere to industry-supported standards, instead of running off and forcing people to write separate codepaths for them, which wastes time and money."

They did adhere to the standards, but they conveniently failed to mention that if you try to use nVidia cards for DirectX 9 you won't get real-time performance. The standard doesn't say it has to be playable.
As I understand it, the separate code paths Valve made were an attempt to get real-time performance by only partially using DirectX 9 quality. But then they found it only improved one of nVidia's cards, and it still wasn't enough to give a decent frame rate. So they decided to throw out all that work and just force nVidia cards to run in the crappy DirectX 8 quality mode, because that's all they can handle at a playable frame rate - and even then, it was still slower than the ATI card running in full DirectX 9 quality mode.
Anyway, before I accuse nVidia of some Enron-type scandal, I'm going to give them the benefit of the doubt and assume that maybe Valve screwed up somehow, because judging by the quality of Steam, they're apparently capable of totally screwing up.
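For illustration, here's a rough sketch of the tiered fallback described above - try the full DirectX 9 path, and if it isn't playable, drop down. All the names, thresholds, and sample numbers are made up; it just shows the shape of the decision, not Valve's actual code.

CODE:
#include <cstdio>

// Hypothetical sketch of a tiered shader-path fallback. Invented names
// and numbers for illustration only - not Valve's code.
enum ShaderPath { PATH_DX9_FULL, PATH_DX9_MIXED, PATH_DX8 };

struct GpuCaps {
    bool supportsPs20;        // pixel shader 2.0, i.e. DirectX 9-class hardware
    double dx9FramesPerSec;   // measured frame rate on the full DX9 path
};

// Pick the highest-quality path that still hits a playable frame rate.
ShaderPath ChoosePath(const GpuCaps& caps, double playableFps) {
    if (!caps.supportsPs20)
        return PATH_DX8;                      // pre-DX9 hardware: DX8 path only
    if (caps.dx9FramesPerSec >= playableFps)
        return PATH_DX9_FULL;                 // full DX9 quality is playable
    // A DX9-class card that can't run the full path fast enough. Per the
    // post above, the partial (mixed) DX9 path wasn't enough either, so
    // the fallback goes all the way down to the DX8 path.
    return PATH_DX8;
}

int main() {
    GpuCaps radeon  = { true, 60.0 };  // runs full DX9 at a playable rate
    GpuCaps geforce = { true, 14.0 };  // DX9-capable on paper, too slow in practice
    std::printf("radeon  -> path %d\n", (int)ChoosePath(radeon, 30.0));   // 0 = DX9 full
    std::printf("geforce -> path %d\n", (int)ChoosePath(geforce, 30.0));  // 2 = DX8
    return 0;
}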
QUOTE (KungFuSquirrel): "I couldn't care less whether I'll run HL2 or Doom3 better - they'll run satisfactorily. In this case, it was cost over performance, and I got a hell of a deal for my money, all things considered."
/mumbles "Damn SOB, I have a POS comp and he's got all this good stuff!"