Bad News For Nvidia Users
<div class="IPBDescription">no HL2 for you</div> oh my god, did you hear the news on planethalflife? Valve made a presentation saying basically that Nvidia graphic cards are not working with HL2. oh well too bad that sucks. good thing i waited so now i know to buy an ATI.
Comments
-JohnnySmash
Valve takes serious issue with "optimizations" from NVIDIA as of late.
In no uncertain terms, Valve is highly disappointed with current NV3x hardware as a high-performance DX9 accelerator.
Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings.
According to Gabe, the rumors and speculation of ATI paying them off are nothing but bull - he said Valve's top priority has everything to do with wanting HL2 players to have the best experience. After doing some early benchmarking between NVIDIA and ATI, the choice was clear from Valve's standpoint of who to partner with.
Microsoft's DirectX team was on hand to give its full blessing to Valve's upcoming HL2 benchmark - in fact, it's being referred to as the most complete DX9 benchmark to date.
From here: http://www.gamersdepot.com/interviews/gabe/002.htm (dated yesterday)
QUOTE:
GD: What's your relationship with NVIDIA been like in light of all the recent ATI press over HL2?
Gabe: Valve and NVIDIA both know that we have a lot of shared customers, and we've invested a lot more time optimizing that rendering path to ensure the best experience for the most customers.
It sounds like ATI cards are obviously better from a programmer's standpoint, but they're not going to leave NVidia customers out in the cold, far from it.
LOL, if he really thought only of Half-Life 2 players, he wouldn't make it a monthly cost for playing online! :(
rawr!
bert!
QUOTE:
LOL, if he really thought only of Half-Life 2 players, he wouldn't make it a monthly cost for playing online! :(
rawr!
bert!
Half-Life 2 is NOT pay-to-play, damn it!
Basically, it goes like this: until they find a way to work with the new NVidia drivers, it's going to have a performance decrease of 20%-40% compared to other games with the same graphical capability.
The whole pay-to-play rumor is full-on bull crap. OK? So stop bringing it up. Planethalf-life.com is getting left in the dust, and in a desperate attempt to get more hits (and given that gamespy.com doesn't run it themselves, they just host it), they keep putting up BS news stories, and it's been going on like this for a while.
Half-Life 2:
Will run on all graphics cards with 64MB (but with VERY decreased detail, unless a card is specifically listed as unsupported on their official site)
Will not be pay-to-play multiplayer
Will not ship in 3 different versions (this was a big rumor that many believed, and right now VALVe is basically trying to find every site that's posted it so they can sue them for fraud and such, since no one has any proof and it's full-on BS)
Will be out on shelves by October 1st (the reason it won't be on shelves September 30th is that most stores are too stupid to put things on the shelves the second they get the shipment, not to mention all the late shipments)
Will ship with *MOST* mod tools (the newest version of Hammer will be included [map editor], the Speech Unit [for lip synching], and Steam; through Steam, they plan to offer a modeling program you can buy called XSI [models let you make characters, guns, vehicles, and such])
Will run with little or no bugs (not sure yet, but that's what the official recorded word from Gabe Newell says)
If any of this is incorrect and you can point me to something that proves it, do it. I've been following this, and just about everything they say is unfounded. I bet that most of planethalf-life.com's stuff comes from the IRC channel #halflife2 (which is a fake, BS channel).
Probably half a day after it's out, they're gonna patch HL2 to run great with NVidia cards.
maybe those drivers cheat?
QUOTE:
And I'm proved right again.
Or, maybe you're *proven* right again. :P
(Hehe, couldn't resist)
Anyway, I've had quite enough of people spreading so many rumors about the game. It's just common sense that Valve would lose too much of its fanbase if Half-Life 2 didn't run on NVidia cards. Advice to all you rumor starters/spreaders: look for and read the solid facts first, before you start posting wildly about something that isn't even true.
BTW, what's interesting is that, in general, Doom 3 has better frame rates than Half-Life 2. This totally contradicts reports I've heard before.
QUOTE:
The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half-Life 2.
We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half-Life 2 on NVIDIA hardware.
Yes. Oh yes.
And also, HL2 is NOT going to be crap on NVidia cards. Valve would be stupid to let over 50% of their customers have crappy performance... (NVidia still holds market dominance last time I checked)
QUOTE:
Yes. Oh yes.
And also, HL2 is NOT going to be crap on NVidia cards. Valve would be stupid to let over 50% of their customers have crappy performance... (NVidia still holds market dominance last time I checked)
Actually, ATI holds market dominance (fighting with Diamond/S3) in a much larger field: the OEM market. :D
nVidia *does* hold the budget card share quite well (mostly due to their marketing department running a smear campaign to kill the PVR Kyro2, which also killed the K3 even before it could be put in a card), and some of the mid-range.
Mid-high falls to ATI pretty overwhelmingly (if you need speed and image quality, and don't want to lay out $20,000 for a vid card, you're getting an ATI), and high end is (of course) dominated by Matrox and niche cards.
HL2 will *run acceptably* on a GFFX, but it will default to DX8.1 mode (no Pixel Shader 2.0 support, which is required to see all the pretty-pretties the way VALVe intended). You can shift it into full DX9 mode, just expect *huge* framerate hits due to nVidia's slipshod approach to architecture updates.
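For anyone wondering how an engine even decides which path to run, here's a minimal sketch of the kind of capability check you can do with plain Direct3D 9 (a generic illustration, not Valve's actual code - the path names and messages are made up for the example):

CODE:
// Generic illustration: query the D3D9 device caps and pick a render path.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("Pixel Shader 2.0 reported -> full DX9 path available\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            std::printf("Only PS 1.x reported -> DX8.1-style fallback path\n");
        else
            std::printf("No usable pixel shaders -> fixed-function path\n");
    }

    d3d->Release();
    return 0;
}

Note that a GeForce FX will happily report PS 2.0 support in a check like this; Valve's DX8.1 default for NV3x is a deliberate override on top of the caps, based on measured performance rather than what the hardware claims to support.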
QUOTE (PHL @ nVidia Press Release):
Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation.
Hmmm. Let's see. Admitting outright that their cards are hard for programmers to work with (requiring specific rendering pipelines to get any kind of performance), making excuses as to why the full new (standardized) way is no better than their current incomplete implementation thereof, and finally... ACTUALLY EXPECTING that people will *believe* that you have no image quality degradation when you're losing *16 bits of precision*!
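To put a rough number on what dropping from 32-bit to 16-bit floats throws away, here's a tiny stand-alone illustration (a crude simulation that only truncates the mantissa to half-float width; a real FP32-to-FP16 conversion also narrows the exponent range and rounds, but the point stands):

CODE:
#include <cstdint>
#include <cstdio>
#include <cstring>

// Keep just the sign, exponent, and top 10 mantissa bits of a 32-bit float,
// roughly the precision a 16-bit half float can store.
static float truncate_to_half_precision(float v) {
    uint32_t bits;
    std::memcpy(&bits, &v, sizeof bits);
    bits &= 0xFFFFE000u;               // zero the low 13 mantissa bits
    std::memcpy(&v, &bits, sizeof v);
    return v;
}

int main() {
    float full = 1.0f + 1.0e-4f;                    // representable at 32-bit
    float half = truncate_to_half_precision(full);  // the small term vanishes
    std::printf("32-bit:  %.6f\n", full);           // prints 1.000100
    std::printf("~16-bit: %.6f\n", half);           // prints 1.000000
    return 0;
}

Whether a difference at that scale is ever visible on screen depends entirely on what the shader is computing, which is exactly the argument the two camps are having.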
QUOTE:
Well, I don't know much about all this kinda stuff, but I have a simple question:
My system specs are an Athlon XP 1800, 512 MB of DDR RAM and a Winfast GeForce 4 Ti4200 vid card. Will HL2 run fine on my system?
And I got an answer of "Yes". Thank you, CWAG. :D
But now I'm after more specific information. I wish to know what kind of HL2 experience I will be receiving. Could I have all graphical options on? Could I have full pixel shading etc. *insert all those weird funky techy thingies that cards do these days*? Will the game run smoothly? Will I have to tone down some of the graphical options? If someone knows the answers to these questions, please reply, because I can't make out anything from the tangled web this thread has become. :P
Like I said, I'm second-guessing now. ATI is just out to make money, and I can't afford to keep spending small-car money on video cards.
Either Valve/NVidia is stirring the pot for kicks and giggles,
or
they are farking well serious, and I might just have to have a word with Gabe and make some eloquent arguments (i.e. I'm gonna slap him like a ****).
Whatever, HL2 is as sweet as a very sweet thing, so it might be worth it anyway.
If you have a GF4 card or higher, you're gonna be able to play HL2 just fine, but you won't be getting the SEX you would with a new ATI.
The game will be playable, just not full of the SEX you would get from the Radeon.
You will not be able to have all graphical options on, as you do not have a fully DirectX 9-optimized card - many of the cool effects in this video require that. A GeForce 4 was mainly optimized for 8.1 - grabbing a late-model ATI 9x00 or NVidia FX-class card will get you full DirectX 9 compatibility...
As for the rest of this nonsensical thread (complete with 'Le Fanbois de ATI'), just wait and see. It's not like it's in Valve's best interest for HL2 to run badly on the half of modern video cards that Nvidia makes. I'm sure the performance differences at the high end will be rather negligible to the human eye, just like in all games. This sort of dopey thread happens with every big-name game ever released, and it always ends up being a moot point.