IronHorse (Developer, QA Manager, Technical Support & contributor)
edited October 2011
no. simply put, no.
lol. i'm sorry, i'd love to agree with you, but carmack himself has discussed in multiple interviews this year how the tech was designed with consoles in mind: low ram, high latency, better suited to megatexture tech with occlusion culling. he said PCs don't really benefit much from the tech, since they have much more ram and lower latency. http://www.maximumpc.com/article/news/john_carmack_says_id_will_take_console-first_approach_games
the issue with the tech is this: it's meant to work with highly compressed textures. you are trading away high-resolution, low-compression textures (the kind that get instanced often) in exchange for the artist's ability to paint unique, non-tiled, non-instanced textures. there's talk of the rumored 1 TB of textures they used for internal development. obviously not practical. here's a great read: http://gamasutra.com/blogs/BenjaminQuintero/20110822/8253/Things_to_Consider_About_Megatextures.php
"there are ways to make them better" you mean by manually changing these cvars? http://www.geforce.com/News/articles/how-to-unlock-rages-high-resolution-textures-with-a-few-simple-tweaks
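(for anyone following along: the tweak in that article amounts to creating a custom config file. the cvar names and values below are from memory, so verify them against the linked GeForce article before using; this is a sketch, not the definitive file.)

```
// rageconfig.cfg -- names/values recalled from the GeForce article, verify before use
vt_pageimagesizeuniquediffuseonly2 8192
vt_pageimagesizeunique 8192
vt_pageimagesizevmtr 8192
```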
yea.. you're still working with the compressed textures they packed into the game. they are the 4k and 8k ones, not the 16k promised. as for the promised texture pack? http://twitter.com/#!/ID_AA_Carmack carmack's latest tweet: the source files were ###### textures anyways. and a tweet or two below that, he says they'll include the "detail layer" like in UE3. ::rolls eyes:: rumor has it the environment artists there were very unhappy about having to produce said crappy quality, all to keep their game at a solid 60fps on consoles.
that screenshot above is a far-away shot, and you know it. i'm not going to dump the thousands of blurry texture shots every rage forum is posting (you can look yourself), but suffice it to say, i'm sure we can both rattle off a dozen titles from the past 3 years that supersede these graphics. hell, even their last engine's graphics look better: http://www.dsogaming.com/news/doom-3-modded-overshadows-id-softwares-rage-new-drooling-screenshots/
bottom line: graphically, it failed.
it blew my mind as far as animations go, though? :) ns2 is beautiful; i still get caught staring at marine models in the rr, or the odd texture here and there.
(p.s. couldn't help myself, had to edit to post a shoddy-texture link hehe) http://forums.steampowered.com/forums/showthread.php?t=2154243
NS2 textures maxed out are as bad as Rage's at current, but hell, this is a beta, and I never said Rage had good textures. I specifically told you it didn't, so I dunno why you got all uppity.
Yeah, he said it was a mistake to make the game with consoles in mind, but that doesn't change the engine. When they port it to Xbawks they are looking into making a D3D version; with that, a lot more could happen. Carmack stated he expected the texture pack to be 50-60GB (utterly absurd).
The problem is MegaTextures provide an overall awesome look, but in the end it is just one giant texture, so it does blur like that.
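The blur people describe is a streaming effect: the virtual-texture system pages tiles of the megatexture into a fixed-size cache, and on a miss the renderer shows a low-res fallback until the sharp page arrives. A toy sketch of that behavior (illustrative only, not id Tech 5's actual implementation; the LRU policy and names are assumptions):

```python
from collections import OrderedDict

class PageCache:
    """Toy virtual-texture page cache with LRU eviction."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()   # resident page ids, LRU order
        self.fallbacks = 0           # times a low-res mip had to be shown

    def sample(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)   # mark as recently used
            return "sharp"
        # miss: show the blurry fallback now, stream the page in for later
        self.fallbacks += 1
        self.pages[page_id] = True
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)    # evict least recently used
        return "blurry"
```

With a cache smaller than the visible working set, every freshly visible page samples "blurry" first, which is exactly the pop-in people see when swinging the camera quickly.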
However, with current technology advancement and, quite simply, consumer choice, OpenGL is going to overtake DirectX in the not-too-distant future, and that's why Carmack is continuing with it.
You might ask how I can say that; go ahead: I don't see DirectX being opened up to other platforms in the future, and I can only see OpenGL being worked on and worked on.
PS. Rage actually has some pretty amazing textures if you have a computer that can handle it. BETA drivers fix the performance problems on the auto texture level, and you can manually set your cache etc. high so it doesn't blur-load. Although Carmack has stated Rage was developed for consoles, so set it to Max FPS 30 and you'll get your Xbox experience in full effect.
Equally, if you do have Rage, I recommend this link; it will sort you out: http://www.geforce.com/News/articles/how-to-unlock-rages-high-resolution-textures-with-a-few-simple-tweaks
Note: default textures are 4MB (the console/not-so-good ones people are on about). You can set them to 8MB, or, if you're like me and you have balls and a half-decent system, 16MB textures are set and are pretty much beautiful, like that shot previously seen in the thread.
QUOTE (ToSsHiBa @ Oct 14 2011, 11:08 PM): "i'll prefer always dx9 for ns2 to get better performace, dx11 will get lower perform"
That's not true and makes no sense. Implementing the same features in DX9 and DX11 will make little performance difference, except when the capabilities of DX11 allow a feature to be implemented more efficiently (i.e., DX11 is faster).
IronHorse
edited October 2011
QUOTE (Mkilbride @ Oct 16 2011, 12:15 PM): "NS2 textures maxed out are as bad as Rage's at current"

didn't mean to sound "uppity"? just saying you are wrong, good sir, and wishing to explain why :) and here i am to say it again to that quote above. i searched the nooks and crannies for the worst textures to get these, so these aren't "beauty shots" one would publish. and NS2 still wrecks rage's textures.

http://i.imgur.com/RPtfw.jpg
http://i.imgur.com/RnXhA.jpg
@konata

did you read my post above yours? i already provided that link.

"Rage actually has some pretty amazing textures if you have a computer that can handle it."

no, they don't. i have everything manually set to the highest possible and get 60 fps. character textures, yes; general item and world textures, no. http://forums.steampowered.com/forums/showpost.php?p=25757695&postcount=507 (reference that last link i pasted at the bottom of my other post for 36 pages' worth of examples.) and just like that article states, the 16MB textures were not included with the game; that's why you see no difference. it's been proven in forums it still sets back to 8. also, 16MB textures would require 3GB+ VRAM video cards, as noted by bethesda and that site you linked. uh, good luck? and just like the other link i provided explains the limitations of megatexture tech: you reach the threshold of hard drive I/O and will experience thrashing etc.
i'd LOVE to agree with you guys on this matter, but the bottom line is, once again: graphically, id dropped the ball. this is not the game we were advertised for years. gameplay is great though? and finally, NS2 is looking awesome graphically, with actual dynamic lighting and high-resolution textures. i think my point is nailed home. :) good discussion guys.
QUOTE (kabab @ Oct 17 2011, 01:56 AM): "I don't understand why "players" care so much about graphic API's the choice of API has very minimal effect on the player... It's really a much more important factor for the developer depending on their goals and experience but has sfa effect on players.."
Writing a game in OpenGL or DX9 or DX10 or DX11 is not going to magically make it leaps and bounds faster; there are many considerations to factor in well beyond which graphics API is used... After all, the graphics API is just a layer between the game and the hardware...
So really it has no effect on players... If NS2 was built on DX11 or OpenGL we would have the exact same problems as we do now...
So really it does not matter which graphics API the developer picks all that much for the gamer....
It does, because even after the optimization it could have better performance in DX11, if properly implemented. Christ, I'm not saying anything against them; I'm only saying it to the people here who seem to think wrongly about DX11 and APIs.
OpenGL is slower, and has less graphical fidelity these days. It is behind DirectX in terms of performance & graphics.
IronHorse
QUOTE (Mkilbride @ Oct 16 2011, 07:00 PM): "Ironhorse, if you're going to be childish, then I'm not going to respond to you anymore."
then don't? i don't see how you keep getting insulted, man?? or why you return with offensive responses? i've disagreed, laid out my argument with supporting evidence and links, and avoided anything i'd consider trolling or "childish"? promise i haven't written anything in sarcasm, either. sorry if i've offended you somehow.
but i do take your lack of rebuttal as conceding my argument i suppose.
QUOTE (Mkilbride @ Oct 17 2011, 01:23 PM): "It does, because even after the optimization, it could have better performance in DX11, if properly implemented. [snip] OpenGL is slower, and has less graphical fidelity these days. It is behind DirectX in terms of performance & graphics."

Yeah, I know DX11 is faster, but they have limited resources, so they either develop DX11 and throw away the DX9 market, or they develop both render paths, which, given their limited resources, will be problematic..
DX9 vs OpenGL is really going to come down to developer experience and whether they plan to go to other platforms. Even if one API is slower than another, those losses can easily be offset by how resource-constrained they are in terms of development manpower...
Kouji_San (Sr. Hive Upkeeper, EUPT Deputy, The Netherlands)
QUOTE (kabab @ Oct 17 2011, 02:06 AM): "So really it does not matter which graphics API the developer picks all that much for the gamer...."

I like my Windows XP Pro experience dammit! :D
Ugh, you're not getting it, I've said it like three times now. I AM NOT SAYING THEY SHOULD. I was just explaining that DX11 offers performance benefits over DX9, because some people were saying it was slower.
Kouji_San
QUOTE (Mkilbride @ Oct 17 2011, 07:49 PM): "Ugh, you're not getting it, I've said it like three times now. I AM NOT SAYING THEY SHOULD. I was just explaining that DX11 offers performance benefits over DX9, because some people were saying it was slower."

Keep yar pantz on, I was making a funny :D
Kouji_San
edited October 2011
QUOTE (konata @ Oct 18 2011, 05:15 AM): "And if he liked his linux experience he'd have made a funny about wanting it OpenGL ;)"

They mostly come in DirectX... Mostly...
QUOTE (kabab @ Oct 16 2011, 09:06 PM): "Writing a game in OpenGL or Dx9 or dx10 or dx11 is not going to magically make it leaps and bounds faster... [snip] After all the graphics API is just a layer between the game and hardware..."
There is a massive difference between different APIs, that "magically" make them quite a good bit faster or slower on different things.
The cost of draw calls in OpenGL has been much cheaper than in Direct3D for a very long time; that's been a key reason professional applications have preferred OpenGL for so long. DX10 really slashed driver overhead for draw calls and state changes: the Vista driver model changed so that you could avoid going back and forth into kernel mode for each draw call or state change (which ****ing sucks, performance-wise), and a lot of fixed-function legacy crap was removed, so you don't have to make as many state changes and there is less state data to track.
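To put rough numbers on why those kernel transitions hurt, here's a toy cost model (the constants are invented for illustration, not measurements of any real driver):

```python
# Toy cost model: per-draw kernel transitions (DX9-style) versus
# batched command submission (DX10-style). Constants are made up.
KERNEL_TRANSITION_US = 5.0   # assumed cost of one user->kernel round trip
PER_DRAW_CPU_US = 0.5        # assumed CPU cost of recording one draw

def frame_cpu_time_us(draw_calls, draws_per_submission):
    """CPU-side frame cost when draws are flushed in batches."""
    submissions = -(-draw_calls // draws_per_submission)  # ceiling division
    return submissions * KERNEL_TRANSITION_US + draw_calls * PER_DRAW_CPU_US

per_call = frame_cpu_time_us(2000, 1)    # one transition per draw
batched = frame_cpu_time_us(2000, 200)   # command buffer flushed rarely
```

Same draw count either way, but the per-call version pays the transition 2000 times and the batched one only 10; that is the shape of the DX9-to-DX10 win being described.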
Resources have to be validated so that they are correctly formatted and don't cause problems; in DX9 this happens for every draw call. In DX10 it happens ONCE when the resource is created.
In DX10 you can make texture arrays; so you have fewer texture binding operations and you don't have to resort to kludges like texture atlasing.
Occlusion queries in DX9 require CPU intervention. In DX10 they're entirely on the GPU.
In DX9 you had to change each state individually. In DX10 you have state objects that allow you to change a whole bunch of them in a single call.
In DX9 you updated shader constants one at a time. In DX10 you have constant buffers that allow you to update a whole bunch of them at a time.
In DX9 you only have 4 render targets; in DX10 you have 8. NS2 uses a deferred renderer, so this should be very useful in reducing CPU overhead.
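The state-object and constant-buffer points above are both about cutting API chatter. A minimal sketch of the call-count difference, using a fake device object (the method names only loosely mimic the DX9/DX10 entry points; this is not the real D3D API):

```python
# Fake device that just counts API calls; an illustration of one-call-per-
# constant (DX9 style) versus one buffer upload (DX10 style), not real D3D.
class FakeDevice:
    def __init__(self):
        self.api_calls = 0

    def set_constant(self, register, value):
        """DX9 style: one call per constant register."""
        self.api_calls += 1

    def update_constant_buffer(self, values):
        """DX10 style: one call uploads the whole buffer."""
        self.api_calls += 1

constants = [float(i) for i in range(64)]

dx9 = FakeDevice()
for reg, value in enumerate(constants):
    dx9.set_constant(reg, value)          # 64 calls

dx10 = FakeDevice()
dx10.update_constant_buffer(constants)    # 1 call
```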
For a game that is totally and utterly CPU bound this really matters.
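On the render-target point specifically: a deferred renderer fills a G-buffer, and if the G-buffer needs more attachments than the API allows bound at once, the scene geometry has to be drawn again for the overflow. A back-of-envelope helper (the 6-attachment G-buffer below is hypothetical, not NS2's actual layout):

```python
import math

def geometry_passes(gbuffer_attachments, max_render_targets):
    """How many times scene geometry must be drawn to fill the G-buffer."""
    return math.ceil(gbuffer_attachments / max_render_targets)

# A hypothetical 6-attachment G-buffer under each API's MRT cap:
dx9_passes = geometry_passes(6, 4)   # capped at 4 simultaneous targets
dx10_passes = geometry_passes(6, 8)  # capped at 8 simultaneous targets
```

Halving the geometry passes halves the draw calls for that stage, which feeds straight into the CPU-overhead argument above.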
BTW: the reason for choosing DX9 only was not unsound. 20% of people are still on XP and they expected to be done by now. Making good use of several APIs is a pain.
QUOTE (Soylent_green @ Oct 18 2011, 12:14 PM): "There is a massive difference between different APIs, that "magically" make them quite a good bit faster or slower on different things. [snip: full post quoted above]"

This is a nice summary. The only thing I'd add is that DirectX 10/11 has some interesting multi-threading features, but I think the driver support for that might not be in great shape yet. Only having to support a single API is exactly the reason we're only supporting DirectX 9 at the moment. For most of this project we've only had one engine programmer to work on the networking, physics, graphics, tools, etc., so we need to pick our battles.
I always thought you guys wanted to go cross-platform. Which, sure, if you're looking at Xbox, D3D is a good idea. But for those guys on Linux/Mac, aren't you going to have to rewrite the game code almost from the ground up to port it across?
InsaneAnomaly (NS2 Developer, Forum Admin)
QUOTE (konata @ Oct 18 2011, 08:47 PM): "aren't you going to have to rewrite the game code almost from the ground up to port it across?"
I think I remember Max saying that the core renderer is quite self-contained, so only that bit would have to be rewritten, rather than writing the entire game engine from scratch all over again.
QUOTE (konata @ Oct 18 2011, 01:47 PM): "aren't you going to have to rewrite the game code almost from the ground up to port it across?"

There are about 6,500 lines of Direct3D code in the engine compared to 194,399 total in the engine (and who knows how many in Lua), so it's only a small fraction.
QUOTE (Soylent_green @ Oct 19 2011, 06:14 AM): "There is a massive difference between different APIs, that "magically" make them quite a good bit faster or slower on different things. [snip: full post quoted above]"

Good info man :)
My post was more in the context that they've only got one graphics programmer, so they don't have the resources to write and support multiple render paths.
I wonder how much performance they would gain going to DX10, and whether the bottlenecks in the game and server code would render most of these efficiencies moot..
sorry man, i googled it before posting and there were people saying dx9 was kind of better, because dx11 gets much better graphics at a performance cost.
my bad =x
lucky i bought an ati 5770 that has dx11 compatibility
Players want better graphics and performance?
Graphical fidelity & performance are minimal to you? So the game could chomp along @ 20FPS and you'd be just dandy? :P
@kabab, you are correct.
http://www.tested.com/news/opengl-vs-directx-the-graphics-api-wars-begin-anew/483/
http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX
NS2 looks great on DX9, and it is their decision to make it with DX9. Let them do what THEY want to do.
I like my Windows XP Pro experience dammit! :D
Keep yar pantz on, I was making a funny :D
<i>They mostly come in DirectX... Mostly...</i>
There is a massive difference between different APIs that "magically" makes them quite a bit faster or slower at different things.
The cost of draw calls in OpenGL has been much cheaper than in Direct3D for a very long time; that's been a key reason why professional applications have preferred OpenGL for so long. DX10 really slashed driver overhead for draw calls and state changes: the Vista driver model was changed so that you can avoid going back and forth to kernel mode for each draw call or state change (which ****ing sucks, performance-wise), and a lot of fixed-function legacy crap was removed, so you don't have to make quite as many state changes and there is less state data to track.
Resources have to be validated so that they are correctly formatted and don't cause problems; in DX9 this happens for every draw call. In DX10 it happens ONCE when the resource is created.
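A toy cost model (plain Python, with invented per-call costs, not measured driver numbers) shows why moving validation from per-draw-call to resource-creation time matters at typical call counts:

```python
# Toy model: resource-validation work per frame, in arbitrary units.
# The counts and unit cost below are illustrative assumptions.
VALIDATE_COST = 1  # one unit of work per validation

def dx9_validation_cost(draw_calls_per_frame):
    # DX9: bound resources are re-validated on every draw call.
    return draw_calls_per_frame * VALIDATE_COST

def dx10_validation_cost(resources_created):
    # DX10: each resource is validated once, when it is created.
    return resources_created * VALIDATE_COST

# 2000 draws/frame vs 500 resources created over the whole level:
print(dx9_validation_cost(2000))  # paid every single frame
print(dx10_validation_cost(500))  # paid once, then amortized
```

The absolute numbers are made up; the shape is the point: DX9's cost scales with draw calls per frame, DX10's with resources created.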
In DX10 you can make texture arrays; so you have fewer texture binding operations and you don't have to resort to kludges like texture atlasing.
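To see why atlasing is a kludge: packing several textures into one big sheet means every UV coordinate has to be remapped into its sub-rectangle (and edge bleeding managed by hand), whereas a texture array just takes a slice index. A minimal sketch of that UV bookkeeping (plain Python, hypothetical 2x2 atlas):

```python
# The per-vertex UV remapping that texture atlasing forces on you.
# With a DX10 texture array you would pass an array slice index instead.

def atlas_uv(u, v, tile_x, tile_y, tiles_per_side=2):
    """Remap a texture's local (u, v) into its tile of a square atlas."""
    scale = 1.0 / tiles_per_side
    return (tile_x * scale + u * scale,
            tile_y * scale + v * scale)

# Sample the centre of the texture sitting in the top-right tile:
print(atlas_uv(0.5, 0.5, 1, 0))  # (0.75, 0.25)
```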
Occlusion queries in DX9 require CPU intervention. In DX10 they're entirely on the GPU.
In DX9 you had to change each state individually. In DX10 you have state objects that allow you to change a whole bunch of them in a single call.
In DX9 you updated shader constants one at a time. In DX10 you have constant buffers that allow you to update a whole bunch of them at a time.
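The win in both of those cases has the same shape: N fine-grained API calls collapse into one or two. A toy call-counting model (Python, invented counts for illustration):

```python
# Toy model: API calls needed to reconfigure the pipeline for one draw.

def dx9_calls(num_constants, num_states):
    # DX9: one Set* call per shader constant and per render state.
    return num_constants + num_states

def dx10_calls(num_constant_buffers, num_state_objects):
    # DX10: one update per constant buffer, one bind per
    # pre-built immutable state object.
    return num_constant_buffers + num_state_objects

# e.g. 24 constants + 6 render states vs 1 cbuffer + 1 state object:
print(dx9_calls(24, 6))   # 30 calls
print(dx10_calls(1, 1))   # 2 calls
```

Multiply that difference by thousands of draws per frame and it is easy to see where the CPU time goes.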
In DX9 you only have 4 render targets; in DX10 you have 8. NS2 uses a deferred renderer, so this should be very useful in reducing CPU overhead.
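For a deferred renderer, the MRT limit decides how many G-buffer attributes you can write in a single geometry pass; blow past it and you need extra passes. The arithmetic, as a sketch (RGBA targets assumed; the example channel budget is hypothetical, not NS2's actual G-buffer layout):

```python
# How many G-buffer channels fit in one geometry pass?
CHANNELS_PER_TARGET = 4  # one RGBA render target = 4 channels

def gbuffer_channels(max_render_targets):
    return max_render_targets * CHANNELS_PER_TARGET

# Hypothetical layout: albedo(3) + normal(3) + depth(1) + specular(1)
# + emissive(3) + roughness(1) = 12 channels needed.
needed = 12
print(gbuffer_channels(4))  # DX9's 4 MRTs: 16 channels, a tight fit
print(gbuffer_channels(8))  # DX10's 8 MRTs: 32 channels, plenty of headroom
```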
For a game that is totally and utterly CPU bound this really matters.
BTW: the reason for choosing DX9 only was not unsound. 20% of people are still on XP and they expected to be done by now. Making good use of several APIs is a pain.
This is a nice summary. The only thing I'd add is that DirectX 10/11 has some interesting multi-threading features, but I think the driver support for that might not be in great shape yet. Only having to support a single API is exactly the reason we're only supporting DirectX 9 at the moment. For most of this project we've only had one engine programmer to work on the networking, physics, graphics, tools, etc. so we need to pick our battles.
I think I remember Max saying that the core renderer is quite self-contained so that only that bit would have to be re-written rather than writing the entire game engine from scratch all over again.
There are about 6,500 lines of Direct3D code in the engine compared to 194,399 total in the engine (and who knows how many in Lua) so it's only a small fraction.
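Those figures put the Direct3D-specific portion at roughly:

```python
# Share of engine code that touches Direct3D directly,
# using the line counts quoted above.
d3d_lines = 6_500
engine_lines = 194_399
print(round(100 * d3d_lines / engine_lines, 1))  # ~3.3%
```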
Good info man :)
My post was more in the context that they've only got one graphics programmer, so they don't have the resources to write and support multiple render paths.
I wonder how much performance they would gain going to DX10, and whether the bottlenecks in the game and server code would make most of these efficiencies moot...