I thought Far Cry would be mentioned first (although Crysis _has_ been derived from it). There is quite a bit of Lua script in there, but they had to go to extreme lengths to get it running with acceptable performance, something that so far hasn't been the case with NS2 (whatever happened to that super-duper new binding layer from which so much was expected?). I also don't believe Far Cry's Lua system did multithreading (or, for that matter, that Far Cry did at all). Needless to say, vanilla Lua isn't suited for multithreading, but there are some libraries out there that add basic threading functionality; I just can't think of any real-world examples.
QUOTE (Twiggeh @ Jun 5 2011, 12:20 PM):
First of all, any game that's over 20fps is playable. That doesn't mean it's enjoyable, but it's still playable, because you CAN play it (heck, I whooped ass in CS back in the day with 15fps max on a 56k modem).
And why are you using a 32-bit OS when you have 4GB of RAM? Why are you even using a 32-bit OS at all? :S Imho, acquire a Win7 64-bit install disc, reformat your C partition and slap 64-bit Win7 on it.
Compatibility. And the ~1GB of RAM lost isn't a big deal.
On topic: I also have a 5870, backed by an AMD Phenom II X4 955, and I usually get around 40 FPS, with drops into the 20s (really rare).
AFAIK the engine uses the CPU mostly ATM.
To get the most performance possible out of Lua, the developers would have to use the latest LuaJIT beta (which is very stable) and use its FFI rather than slow bindings (where possible) to communicate between Lua and C++.
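A quick illustration of what "FFI rather than slow bindings" means in practice; this is a generic sketch (LuaJIT 2 only), not NS2's actual binding code. Instead of marshalling arguments through the Lua stack into hand-written C++ glue, you declare the C signature and call it directly, and the JIT compiles the call down to a near-native one:

```lua
-- Call straight into C without a hand-written binding layer.
-- cos() lives in the C runtime, which the default ffi.C namespace
-- can resolve on most platforms.
local ffi = require("ffi")

ffi.cdef[[
double cos(double x);
]]

print(ffi.C.cos(0))  --> 1
```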
QUOTE (kflow47 @ Jun 6 2011, 03:19 AM):
Compatibility. And the ~1GB of RAM lost isn't a big deal.
On topic: I also have a 5870, backed by an AMD Phenom II X4 955, and I usually get around 40 FPS, with drops into the 20s (really rare).
AFAIK the engine uses the CPU mostly ATM.
I hope you're joking about compatibility with 64-bit operating systems. There haven't been any major compatibility issues since, what, Windows XP 64-bit? Also, 32-bit can only allocate ~2.5GB.
As for framerate, I have so many problems with my computer, yet I can still hit ~30-40 FPS at any given time. It only gets really bad when there are lots of Hydras or Infestation.
Any kind of concurrency is extremely OS-specific. The Lua language isn't going to sacrifice cross-platform compatibility for those features.
Maybe what you really mean to ask is whether there's a Lua EXTENSION that has concurrency. Well, there's Lua Lanes, but I don't know if any game uses it.
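For reference, a minimal sketch of what Lua Lanes usage looks like; each lane runs in its own OS thread with its own isolated Lua state. (Written against the Lanes 3.x API, so the details may vary by version.)

```lua
local lanes = require("lanes").configure()

-- lanes.gen() builds a lane generator; "*" loads the standard libraries
-- into the new thread's separate Lua state.
local gen = lanes.gen("*", function(n)
  local sum = 0
  for i = 1, n do sum = sum + i end
  return sum
end)

local h = gen(1e7)  -- starts executing on another thread immediately
print(h[1])         -- indexing the handle joins the lane and fetches its result
```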
"Multi-core Lua" is the wrong place to look for optimization. As soon as you say "multi-core," you aren't really talking about Lua anymore.
Which is all a fancy way of saying that a game that uses "single core LUA" can run perfectly fine, given a whole boatload of assumptions. For UWE, it's about tuning what they have, and moving the really time-sensitive bits over to the C++ side, where they can work directly with the OS.
Saying "it can't be optimized because it's LUA" is to blame the wrong thing. At the end of the day, most major software programs have a variety of components written in different languages, all talking to each other. (Often called interoperability, or "interop".) Interop is a fact of life, and there's nothing unique, or tragically flawed, about UWE's approach. Plenty of big systems use lots of interop and are still very fast. Of course, those big systems took a lot of time and money to write.
If NS2 isn't fast enough, blame lack of time and money, not LUA. Optimization is hella expensive.
QUOTE (Ricaz @ Jun 5 2011, 12:58 PM):
I have seen videos from NS2HD which are completely clean, looks like 60+ FPS. I don't know what his setup is like.
An AMD 1090T @ 3.8GHz. Sometimes I run the game on my 5870, sometimes on a GTS 450 (usually only used for my video editing). I see no perceptible FPS difference between the two cards, despite the massive gulf in power between them. Only one thing matters right now: your CPU. But if you compare the performance of, say, build 149 with the performance of build 177, the difference is massive. Performance increases a bit with almost every build. That trend will continue, and one day we will get to the performance we all want :).
And as someone mentioned earlier, my fps when recording is capped at 29.97 (YouTube framerate).
QUOTE (VarXX @ Jun 5 2011, 10:52 PM):
I hope you're joking about compatibility with 64-bit operating systems. There haven't been any major compatibility issues since, what, Windows XP 64-bit? Also, 32-bit can only allocate ~2.5GB.
~3.5GB, actually.
(And to anyone who thinks this is because Windows sucks, be aware it's a HARDWARE limit, not Windows. I think this summarizes it pretty well: http://www.codinghorror.com/blog/2007/03/dude-wheres-my-4-gigabytes-of-ram.html)
I did see some compat issues with 64-bit Vista, especially with drivers. But those have generally been sorted out, so I agree that it's not much of a concern.
In any case, I've found NS2 just doesn't run well on 4GB, 32-bit boxes. I get a lot of paging even on relatively clean boxes. I recently upgraded a box to 6GB (64-bit of course), and NS2 ran much better. I saw no further improvement when moving to 8GB.
QUOTE (slime @ Jun 6 2011, 03:35 AM):
To get the most performance possible out of Lua, the developers would have to use the latest LuaJIT beta (which is very stable) and use its FFI rather than slow bindings (where possible) to communicate between Lua and C++.

This, a thousand times. They were using LuaJIT 2 before, but Max never upgraded to a version with the FFI before deciding to switch to a normal Lua VM so he could add his own custom VM instructions/stack allocation.
QUOTE (SiniStarR @ Jun 6 2011, 12:14 AM):
I don't know a lot about computer architecture, but is there a problem or delay as to why the game isn't relying a lot more on the GPU?
The GPU only paints the house; it's the CPU that's got to build it. Especially when the house has a lot of AI and NPCs like MACs, Drifters, and turret and chamber spam.
QUOTE (fsfod @ Jun 6 2011, 12:37 AM):
This, a thousand times. They were using LuaJIT 2 before, but Max never upgraded to a version with the FFI before deciding to switch to a normal Lua VM so he could add his own custom VM instructions/stack allocation.

Oh, I didn't know he did that; that's disappointing. :(
The problem here is obviously his pirated version of Win7 that can't get the regular MS updates, and the fact that he pirated the 32-bit version. You guys do know that your video card's RAM counts toward that 32-bit limit, right?
QUOTE (Koruyo @ Jun 6 2011, 02:43 AM):
Would you be happier if they just worked on a copycat fast ns2 source version for fast money?
In 2006/7 that would have been a great idea, for several reasons. That ship has sailed now, though.
QUOTE (Squidget @ Jun 6 2011, 05:10 AM):
Any kind of concurrency is extremely OS-specific. The Lua language isn't going to sacrifice cross-platform compatibility for those features.
Maybe what you really mean to ask is whether there's a Lua EXTENSION that has concurrency. Well, there's Lua Lanes, but I don't know if any game uses it.
"Multi-core Lua" is the wrong place to look for optimization. As soon as you say "multi-core," you aren't really talking about Lua anymore.
Which is all a fancy way of saying that a game that uses "single core LUA" can run perfectly fine, given a whole boatload of assumptions. For UWE, it's about tuning what they have, and moving the really time-sensitive bits over to the C++ side, where they can work directly with the OS.
Saying "it can't be optimized because it's LUA" is to blame the wrong thing. At the end of the day, most major software programs have a variety of components written in different languages, all talking to each other. (Often called interoperability, or "interop".) Interop is a fact of life, and there's nothing unique, or tragically flawed, about UWE's approach. Plenty of big systems use lots of interop and are still very fast. Of course, those big systems took a lot of time and money to write.
If NS2 isn't fast enough, blame lack of time and money, not LUA. Optimization is hella expensive.
It is patently obvious Lua was not designed for what it's being used for in NS2. You can see it in the garbage collector, which is seriously overworked with heap garbage (partly because of some lousy, memory-wasting things going on in NS2's VM right now), and in the enormous number of cycles wasted on what should be trivial pathfinding/collision detection (how much of the blame belongs to UWE for shoddy Lua scripting is uncertain), all of which leaves server tickrates in the crapper.
I agree that multi-threading isn't really something Lua was meant or designed to do; the reason it's being brought up is that servers right now are simply crushed by the load the Lua VM puts on them. We're talking about a 10-20x performance improvement that has to be found somehow in order to reach acceptable tickrates when the going gets tough (hydras/infestation). That said, I don't think anybody wants multiple cores of their machine eaten up by a single NS2 server, so the aim should certainly be to keep things single-threaded, if only for the sake of affordable servers.
QUOTE:
and moving the really time-sensitive bits over to the C++ side, where they can work directly with the OS.

Personally, I would've loved it if they had just gone with the tried-and-true game-logic-DLL approach. It would've spared them all this hassle with Lua, and it would've been a lot more pleasant for modders to be able to work with C++ instead (just selfish me talking). The problem now is that everything in the engine is closed-source, so anything they need to move out of Lua into the engine is going to be closed off from modding, which gives me a sadface.
QUOTE:
Saying "it can't be optimized because it's LUA" is to blame the wrong thing. At the end of the day, most major software programs have a variety of components written in different languages, all talking to each other. (Often called interoperability, or "interop".) Interop is a fact of life, and there's nothing unique, or tragically flawed, about UWE's approach. Plenty of big systems use lots of interop and are still very fast. Of course, those big systems took a lot of time and money to write.

There is a difference between machine code and (interpreted) bytecode, though, in all honesty. In time-critical areas, of which there are a lot given NS2's high-performance, triple-A nature, falling back on a bytecode-based language may land you in trouble. It should be said, of course, that Lua is amongst the fastest of the bytecode languages, so things could've been a lot worse.
QUOTE:
If NS2 isn't fast enough, blame lack of time and money, not LUA. Optimization is hella expensive.

You may be right that Lua can be optimized to the point of proper performance (heck, they managed it with Far Cry), but as you pointed out, it is a time-consuming and tedious process; time = money, after all, and I'm not sure how well they're stocked in either department.
Honestly, I would give them a near-endless supply of rope if they decided to spend the time optimizing the Lua script (and the engine; we mustn't forget some of the poor things going on in there too), but they just don't seem to be doing it. Perhaps they're waiting for it to become feature-complete...
QUOTE (Koruyo):
Would you be happier if they just worked on a copycat fast ns2 source version for fast money?

I think many of us were looking forward to something like that, sure. It's not as if they wouldn't get away with charging money for a graphics rehash; NS is still pretty unique these days, and it would've given them a better financial basis from which to work on an engine of their own.
It's quite possible the game just doesn't like your setup; testing on all the different hardware out there is really expensive, so UWE can't really afford to do it before release, which is one of the benefits of the beta.
If you submit your specs and any information like logs or crash dumps or whatever to getsatisfaction they'll probably be able to improve performance if it's just a weird interaction between the game and your hardware setup.
QUOTE (w0dk4 @ Jun 6 2011, 03:57 PM):
Wait for the next patch which should hopefully include the occlusion query optimization.
I don't think OCC will be in this patch; last I saw, Max was still working on it and wanted to get it right the first time before including it in a patch.
QUOTE (kabab @ Jun 6 2011, 04:25 PM):
What you say is interesting, but isn't LUA's rise to fame pretty much based on its ease of use in games?
Many other games/engines have successfully done it, why can't it be achieved in NS2?

If I had to take a guess, it's because UWE set out to enable developers to create a game entirely from within Lua without touching the engine; it's how they're doing NS2, I suppose. Trouble is, this isn't feasible for a AAA title, even with today's CPUs; at some point you really need to delegate work to a machine-code language.
I'm sorry about the slow response, I went to sleep after posting. :>
It could very well be the CPU that's causing my problems; I have a Phenom II X6 running at 3.6GHz, and if the game only uses one or two cores, well... I had some problems with my graphics card before; maybe they are returning. I can still run very high FPS on engines like Unreal Engine 3, Source, CryENGINE 3, etc. I have no problems at all running games like Bulletstorm, Dragon Age 2, SC2, DiRT 3 and such. Those should be, well, up there.
I sort of hope it's all just engine optimization, since I really want to play on release and I can't afford new hardware.
Thanks for all the replies!
QUOTE (player @ Jun 7 2011, 03:02 AM):
If I had to take a guess, it's because UWE set out to enable developers to create a game entirely from within Lua without touching the engine; it's how they're doing NS2, I suppose. Trouble is, this isn't feasible for a AAA title, even with today's CPUs; at some point you really need to delegate work to a machine-code language.

Once they have the game written and sorted in LUA, couldn't they re-write the slow scripts in C++?
I'd imagine there is only a handful of things causing most of the workload...
QUOTE (kabab @ Jun 6 2011, 01:14 PM):
Once they have the game written and sorted in LUA, couldn't they re-write the slow scripts in C++?
I'd imagine there is only a handful of things causing most of the workload...
They COULD, of course. But there are at least two big problems with that:
1) It's about the least efficient way to build a product you can think of. Writing code, throwing it out, and then writing it again in another language is extremely wasteful of time and money. And there will be bugs in translation.
2) It conflicts with UWE's goal of ultimate moddability. C++ is much "less moddable" than Lua, due to both complexity and security. Basically, anything located on the C++ side won't be touchable by the mod community. There's a balance here that all games have to strike. I think in this case, UWE's goal has exceeded their resources.
Player: While Lua has some GC issues, the latest LuaJIT betas approach and sometimes even surpass C# and Java in speed, without using the FFI at all. Even some simple structs created with LuaJIT 2's FFI can speed up certain operations by orders of magnitude.
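To make the struct point concrete, here is the flavor of thing being described; a sketch only (LuaJIT 2), not NS2's actual Vector type:

```lua
-- A plain-Lua vector is a heap-allocated hash table holding three boxed
-- numbers; this FFI struct is 12 contiguous bytes the JIT can often keep
-- in registers.
local ffi = require("ffi")

ffi.cdef[[
typedef struct { float x, y, z; } vec3_t;
]]

local vec3 = ffi.typeof("vec3_t")

local function add(a, b)
  return vec3(a.x + b.x, a.y + b.y, a.z + b.z)
end

local v = add(vec3(1, 2, 3), vec3(4, 5, 6))
print(v.x, v.y, v.z)  --> 5 7 9
```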
QUOTE (kabab @ Jun 6 2011, 10:25 AM):
What you say is interesting, but isn't LUA's rise to fame pretty much based on its ease of use in games?
Many other games/engines have successfully done it, why can't it be achieved in NS2?
Every language has its purpose. Lua is a very fast, simple scripting language with automatic garbage collection, and it's super easy to embed in programs written in other languages. That makes it a great "glue" language to drop into a game to provide scripting.
But "scripting" generally means: simple logic, loading and saving configuration settings, and other crap previously done in configuration files. You want to change how long it takes for an egg to hatch, or how much a shotgun costs? Lua does that well.
But Lua is NOT suited for very processor-intensive or memory-intensive operations, like vector math. (Hence why Max has altered it so much.) So every game has a certain chunk in native code, and a certain chunk in Lua code. There's always a balance of ease-of-use vs. performance.
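As a concrete example of the vector-math problem (a generic sketch, not NS2's actual Vector class): with a conventional table-based vector, every arithmetic operation allocates a new heap object for the GC to clean up later.

```lua
local Vector = {}
Vector.__index = Vector

function Vector.new(x, y, z)
  return setmetatable({x = x, y = y, z = z}, Vector)
end

function Vector.__add(a, b)
  return Vector.new(a.x + b.x, a.y + b.y, a.z + b.z)  -- new heap object per op
end

-- Run per entity, per tick, these temporaries pile up for the collector:
local pos, vel = Vector.new(0, 0, 0), Vector.new(1, 0, 0)
for tick = 1, 1000 do
  pos = pos + vel
end
```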
So the difference between other games and NS2 is that, as far as I know, no other game has SO MUCH of the game engine on the Lua side. And NS2's performance has suffered accordingly.
What Player mentioned is a more standard approach: the critical stuff gets pushed out to C++ modules, and Lua calls those.
QUOTE (player @ Jun 6 2011, 03:43 PM):
It is patently obvious Lua was not designed for what it's being used for in NS2. You can see it in the garbage collector, which is seriously overworked with heap garbage (partly because of some lousy, memory-wasting things going on in NS2's VM right now), and in the enormous number of cycles wasted on what should be trivial pathfinding/collision detection (how much of the blame belongs to UWE for shoddy Lua scripting is uncertain), all of which leaves server tickrates in the crapper.

Some time ago I looked through the Lua code and noticed they are using table constructors {} in functions that are executed very often. Doing so causes the old tables to be handed off to the garbage collector, but whether or not this has a huge direct impact on Lua performance in NS2, I do not know.
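For readers who don't write Lua, the pattern described above, and the usual workaround, look something like this (a generic sketch, not actual NS2 code):

```lua
-- A fresh table per call; every one of them becomes garbage:
local function GetExtents(entity)
  return { x = entity.radius, y = entity.height, z = entity.radius }
end

-- Hoist the table out of the hot path and reuse it; safe only as long
-- as callers don't keep the result across calls:
local scratch = {}
local function GetExtentsReused(entity)
  scratch.x, scratch.y, scratch.z = entity.radius, entity.height, entity.radius
  return scratch
end
```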
QUOTE (slime @ Jun 6 2011, 11:25 PM):
Player: While Lua has some GC issues, the latest LuaJIT betas approach and sometimes even surpass C# and Java in speed, without using the FFI at all. Even some simple structs created with LuaJIT 2's FFI can speed up certain operations by orders of magnitude.

Aye, that's right. I'm wondering why Max didn't go with LuaJIT after all (it was in at some point, but not anymore?).
QUOTE (Max):
Also, as for why disabling LuaJIT wouldn't affect the performance, that one is easy. Executing the actual Lua instructions isn't the bottleneck or a significant factor on the performance. The Lua VM is very fast, so this is expected. The bigger factor is the garbage collector and the layer that interfaces the Lua code with the C++ code. Once build 162 is released this is my next target for optimization.

This old quote from Max justified a new Lua binding layer, which gave a very moderate performance boost in build 171 (or 172/3, somewhere around there) but far from alleviated the issues. Combined with the fact that the hydra/turret-targeting script was shown to be poorly written at best, and a major slowdown, reintroducing LuaJIT would've made sense. Instead, Max added new opcodes to Lua to deal with the Vector problem, which might've rendered it incompatible with LuaJIT.
I'm pretty ignorant regarding Lua, as I've only picked up what I needed to know to get NS2 to interop with C++, but I'm very curious indeed as to how these performance issues are going to be solved in the near future.