<!--quoteo(post=2046170:date=Dec 15 2012, 04:36 AM:name=Salt)--><div class='quotetop'>QUOTE (Salt @ Dec 15 2012, 04:36 AM) <a href="index.php?act=findpost&pid=2046170"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->I'm a game programmer myself, so let me just bust your balls here and say the two aren't correlated at ALL. What you're saying isn't making much sense, but I'll teach you a bit.
Python is a programming language that is categorized as "high level", which is pretty much short for "easy to understand and read", as opposed to "low level", which dips down into C#, C++ and Assembly.
Python is up there in the ranks of Java, Perl, PHP and Visual Basic / BASIC as part of the more 'well known' programming languages. Other than that there's Ruby and some other stuff further down the line.
High level means it is first compiled down to the level of the 'high' interpreter, which can be C++ or C#, from which it receives its instructions, and is compiled down further into Assembly, which is the lowest you can go.
Alright, enough about that. Lua, like ActionScript for Flash, JavaScript for websites, UnrealScript (for Unreal games) or anything that has the word 'script' in it, is a 'scripting' language. Most of them are based upon popular low level entries like C++ and C# as far as their 'syntax' is concerned. But they are, and remain, scripting languages.
A scripting language basically runs on an engine that incorporates several functions and libraries into a front that you can approach with this scripting language. Let's say you write a function in C++ and give it a bunch of parameters; that function then takes care of all the internal memory management, typecasting and error checking. Lua uses this function and you can enter its parameters without the fear of it crashing if you enter the wrong things or leave them empty.
The game engine currently in use by UWE uses this as an interpreter for the core game mechanics, which ARE written in C# and C++. Lua is used as a catalyst in this manner to speed up the process without too many complex designs. So when you've got an idea you can pretty much implement a quick showcase, without months of interior engine design.
The Lua code written is interpreted, which means it will only be as fast as the parent that runs it, in this case the engine written in C++. Whenever you see changes to the game that have much to do with graphical stuff and fixes, it's mostly done to the engine (which is C++) rather than the game itself (which is Lua).
Hope that clears things up a little.
Ninja edit: As far as performance goes, that really is the engine running the game currently; it just needs a lot of optimizing. Like I said, the game will only run as fast as the engine allows, so they've got to start cutting corners everywhere to make things run smoother. It just takes time. Buy a better rig or upgrade your current one if you want to make a difference.
I picked up this $1200 PC last year around this time, and it runs smooth as balls. I imagine what you can get for that price will be twice as good now as it was then.<!--QuoteEnd--></div><!--QuoteEEnd-->
Is it just me or is this post full of misinformation?
Python is an interpreted scripting language. So is Lua. In both cases the interpreter / VM for the language is written in C/C++. Technically even Java and C# are interpreted languages, because they don't run on the host instruction set and must instead operate through a native runtime. These days when people use the term "interpreted language", they are referring to a language which doesn't need to be compiled before it is executed (Perl, Python, Lua and JavaScript being the most commonly used).
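To make the bytecode point concrete, here's a minimal sketch in Python itself (opcode names vary between Python versions, so take the exact instruction list with a grain of salt): even an "interpreted" language first compiles source into VM instructions, and it's those instructions the C-implemented VM actually executes.

```python
import dis

# Python doesn't walk the source text directly: the source is first compiled
# to bytecode, and that bytecode is what the VM (itself written in C) runs.
code = compile("x = 1 + 2", "<example>", "exec")

# The compiled code object carries VM instructions, not source text.
instructions = [ins.opname for ins in dis.get_instructions(code)]
print(instructions)  # opcode names; exact list depends on your Python version

namespace = {}
exec(code, namespace)
print(namespace["x"])  # 3
```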
Calling C# low level is a bit strange. C# and Java are comparable in complexity and usability. In both cases they compile to byte code executed by a native VM.
In NS2, the game engine is usually waiting on the Lua, not the other way around.
FC3 devs totally ######ed over the community by making the multiplayer a CoD clone. You have all these amazing new map tools, like being able to put in vehicles, animals and even AI, which is actually a lot better than even the original (which has the most amazing multiplayer experience), but no, they are not in multiplayer, only stupid game modes with chopper and UAV killstreaks. We have to deal with this stupid Uplay upload system for our castrated custom maps, and you can't even upload maps for co-op, which makes no sense at all since the map maker can edit co-op map settings. The developers had no idea what the ###### they were doing with it. Single-player is amazing but the multiplayer is a heap of trash.
<!--quoteo(post=2046373:date=Dec 15 2012, 05:01 PM:name=Katana-)--><div class='quotetop'>QUOTE (Katana- @ Dec 15 2012, 05:01 PM) <a href="index.php?act=findpost&pid=2046373"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->Is it just me or is this post full of misinformation?
Python is an interpreted scripting language. So is Lua. In both cases the interpreter / VM for the language is written in C/C++. Technically even Java and C# are interpreted languages, because they don't run on the host instruction set and must instead operate through a native runtime. These days when people use the term "interpreted language", they are referring to a language which doesn't need to be compiled before it is executed (Perl, Python, Lua and JavaScript being the most commonly used).
Calling C# low level is a bit strange. C# and Java are comparable in complexity and usability. In both cases they compile to byte code executed by a native VM.
In NS2, the game engine is usually waiting on the Lua, not the other way around.<!--QuoteEnd--></div><!--QuoteEEnd-->
Yeah which is why I kind of ignored it.
Languages are called high and low level based on how far removed from assembly they are / how much like English they read. Most programmers will be using high level languages.
<a href="http://puu.sh/1BtSW" target="_blank">http://puu.sh/1BtSW</a> 120k DX11 tessellated ingame objects (barrels). <a href="http://puu.sh/1BtLL" target="_blank">http://puu.sh/1BtLL</a> physics applied, to only some of them though, from far away. Also sharks n sheet
Yeah okay buddy. This is on Ultra. I can get 70 fps in ULTRA DX11 on the campaign too, in areas where there aren't any AI entities. With a Q6600 and a GTX 460.
<a href="http://cloud.steampowered.com/ugc/920120753268586669/CB2CF585C29AF56964F4FB3B9E3AA9A8978E66D7/" target="_blank">http://cloud.steampowered.com/ugc/92012075...3AA9A8978E66D7/</a> MEANWHILE, IN NS2.
FC3 doesn't run like a dream for me; my experience is pretty much the opposite. Medium is the highest I can go before it gets unbearable, while I can play NS2 on high pretty much until late game. I'm also GPU bottlenecked, though.
@the OP: How is Far Cry 3 your favorite single player game of all time? I think it's a great game, but it squandered a lot of its potential. The looting system is really boring, there is very little reason to explore (relics mean nothing), and it's a little limited in the weapons category. I think if they had taken more cues from, say, Fallout 3, it would have been a MUCH better game.
FC3's single player map is just a bit too big to accommodate the very interesting story elements. 29 radio towers, I mean come on.
Co-op has *yet again* fallen by the wayside... I played the mega clunky FC1 co-op mod yesterday with a flatmate. Nuff said.
OP: In the last ~8 months a lot of builds/patches have had optimization elements. Most people seem to forget NS1 was a system drain too; its long legs are in some way attributable to the game's visual aspect benefitting from hardware increases.
<!--quoteo(post=2046373:date=Dec 15 2012, 06:01 PM:name=Katana-)--><div class='quotetop'>QUOTE (Katana- @ Dec 15 2012, 06:01 PM) <a href="index.php?act=findpost&pid=2046373"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->In NS2, the game engine is usually waiting on the Lua, not the other way around.<!--QuoteEnd--></div><!--QuoteEEnd--> That would interest me...is this true?
I mean, just to clarify: normally when I see a scripting language like Lua, it's used to define actions and give basic instructions to the engine on the gameplay, e.g.:
Lua tells the engine "Create a particle system here, with this effect / sparsity, etc". Then, the C++-driven engine takes care of that particle system until Lua tells it to do something different.
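That division of labor can be sketched like this, with Python standing in for both sides (the names `Engine` and `create_particle_system` are made up for illustration; NS2's real engine API differs). The engine-side function does the validation, so the script can pass sloppy arguments without crashing anything:

```python
class Engine:
    """Stand-in for the native (C++) side of a scripted game engine."""

    def __init__(self):
        self.particle_systems = []

    def create_particle_system(self, position=None, density=1.0):
        # Engine-side validation and defaulting: bad or missing script
        # arguments are sanitized here instead of crashing the engine.
        if position is None:
            position = (0.0, 0.0, 0.0)
        density = max(0.0, float(density))
        self.particle_systems.append({"position": position, "density": density})
        return len(self.particle_systems) - 1  # handle for later script calls

# "Script" side: just calls into the engine, even with sloppy arguments.
engine = Engine()
handle = engine.create_particle_system()                       # empty args: fine
engine.create_particle_system(position=(1, 2, 3), density=-5)  # clamped, not a crash
print(handle, engine.particle_systems[1]["density"])  # 0 0.0
```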
In a similar vein, any webpage you go to will actually "freeze" during the time that JavaScript is executing. You don't notice it, because it all does very basic and asynchronous things; e.g., in order to show a menu, it will change a CSS class, which is then noticed and handled by the browser (i.e., the engine). Actions that can take a long time work asynchronously, yielding execution until the next component is ready and available.
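The "do a small chunk, then yield back to the host" pattern looks roughly like this (a Python generator standing in for an async script task; the names are illustrative). Each resume does one small piece of work and hands control back, so the host loop (browser or game engine) never freezes for the whole job at once:

```python
def long_task(items):
    """A long-running 'script' job, split into small chunks."""
    total = 0
    for item in items:
        total += item
        yield  # hand control back to the host loop after each chunk
    return total

def host_loop(task):
    """The 'engine' owns the loop and interleaves script chunks with frames."""
    frames = 0
    while True:
        try:
            next(task)   # run one chunk of script work
            frames += 1  # then the host gets to render a frame, handle input, etc.
        except StopIteration as done:
            return done.value, frames

result, frames = host_loop(long_task([1, 2, 3, 4]))
print(result, frames)  # 10 4
```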
On FC3: It would not surprise me that Ubisoft simply have far more experienced engineers working on that sort of thing than Unknown Worlds does. Engine design, and tweaking performance on the hundreds of GPUs out there, is not an easy thing.
There are potentially faster alternatives. There are already multiple JIT compilers for Lua, or even static compilers, that can execute Lua code many times faster than the reference interpreter. There are two problems, though:
1) I have no idea what the performance of Max's Lua VM is like compared to the reference Lua interpreter.
2) This may not even be a large bottleneck for NS2 on the client side.
If you're running FC3 on max settings I don't know what the hell you're using, but it must not be available to the public. I can barely run FC3 on max settings, and I have an i7 2600k, water cooled and OC'd to around 4.52 GHz, as well as 16 GB of RAM and 2x GTX 680 Signature, OC'd. And that ###### still gets choppy in parts.
Anyway, I remember this was a big discussion many months back. Multi-threading and multi-core processing are two different things. On a single core, threads aren't actually simultaneous: only one thread can be executing at a time, but the state of each thread is saved as the core switches between them, so it appears that both threads are running at once. Multi-core processing is true parallelism, where things are actually being processed at the same time by each core.
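A small Python illustration of that distinction (Python's threads have the extra wrinkle of the GIL, but the point holds): threads that spend their time waiting overlap those waits, which is concurrency, even when only one of them executes at any instant; CPU-bound work needs multiple cores for true parallelism.

```python
import threading
import time

def waiter(results, i):
    time.sleep(0.2)  # stands in for I/O or any blocking wait
    results[i] = True

results = [False, False]
start = time.monotonic()
threads = [threading.Thread(target=waiter, args=(results, i)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start

# Two 0.2s waits finish in roughly 0.2s total, not 0.4s: the waits
# overlapped even though only one thread runs Python code at a time.
print(results, elapsed)
```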
The OS is actually what assigns which thread runs on which core. In Lua, real threads aren't available through the standard library; coroutines are cooperative and all run within a single OS thread. You can, however, divide things up into different VMs, which the OS can then run in different threads and therefore on separate cores. I have my doubts as to the legitimacy of this being truly parallel, but that's really not the point.
The truth of the matter is, yes, Lua is significantly slower than C++ in the vast majority of applications. Depending on the usage it can be 40x slower, or it could actually be 2x faster. My guess is that here it is significantly slower, but since we can't test the game written in 100% C++, we'll never really know.
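For a rough feel of the scripting-vs-native gap, here's the same experiment in Python rather than Lua (the exact ratio depends entirely on machine and workload; this only shows the direction): a hot loop done in the scripting layer versus the same work pushed into native code.

```python
import time

data = list(range(1_000_000))

# "Scripting layer" doing the work itself, one interpreted step at a time.
start = time.perf_counter()
total_script = 0
for x in data:
    total_script += x
script_time = time.perf_counter() - start

# Same work pushed into native (C-implemented) code via the builtin sum().
start = time.perf_counter()
total_native = sum(data)
native_time = time.perf_counter() - start

# Same answer, but the native path wins the hot loop by a wide margin.
print(total_script == total_native, native_time < script_time)
```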
Either way, Max has done a pretty great job so far increasing performance and I'm sure as time goes on it will get better.
It must be SLI. I have a GTX 460 and the only place it gets choppy for me is on the compound mission, where the fps drops below 40. And that's because I have a Q6600, lol.
I don't have FC3 (yet), but I heard that it eats GTX 680s for breakfast on Ultra. Kinda like the first Crysis, which a single card still can't render properly (<a href="http://benchmarkreviews.com/index.php?option=com_content&task=view&id=877&Itemid=72&limit=1&limitstart=4" target="_blank">http://benchmarkreviews.com/index.php?opti...mp;limitstart=4</a>; notice how the GTX 680 OC delivers an AVERAGE framerate of 66, which leaves plenty of room for it to spend a lot of time below 60)... and how long ago was it released, 5 years ago?
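Strictly speaking, an average of 66 doesn't by itself mean "half the time below 60"; that depends on the distribution of the samples, not the mean. A made-up frame-rate trace shows how a 66 fps average can coexist with half the time under 60:

```python
# Hypothetical per-second fps readings; the values are invented purely
# to illustrate the mean-vs-distribution point, not measured data.
samples = [45, 50, 55, 58, 70, 80, 85, 85]

mean_fps = sum(samples) / len(samples)
frac_below_60 = sum(1 for s in samples if s < 60) / len(samples)

print(mean_fps, frac_below_60)  # 66.0 0.5
```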
I'm still hoping that Maxwell (coming out next year) can run Crysis 1 properly. I've been waiting for a card to replay that with :-) .
There's no way you run FC3 maxed out at a decent resolution with a gtx460 at a playable framerate...
Far Cry 3's graphics don't even look better than Far Cry 2's; in fact it might be worse if it weren't for DX11. You're probably looking at old driver benchmarks. The GTX 460 has no trouble with FC3.
I hope performance gets better, but we are certainly far, far away from the initial "min requirements" that were advertised with the first pre-orders (a 1 GHz Pentium 4 or something like that?).
Also, I would like to remind people about earlier builds that didn't limit client fps to server fps. I don't remember the exact build anymore, maybe 218 or something like that; it's the one where Skulks got hit pretty hard because suddenly all movement got messed up. Since that build I've been suffering from performance issues I didn't have before. The game used to run fine with 100+ fps maxed out; since the changes to client fps/server fps, way too much of the client performance seems to be tied to server performance. For me that led to generally worse client fps and a very noticeable difference in how fluid/reactive the game feels overall, especially in later stages of a round when the map is plastered with buildings and units.
And that's a pretty big bummer, because issues like these you can't fix by simply giving your client more performance, as that won't help at all when the actual bottleneck seems to be the server.
<!--quoteo(post=2047041:date=Dec 17 2012, 10:29 AM:name=NeoRussia)--><div class='quotetop'>QUOTE (NeoRussia @ Dec 17 2012, 10:29 AM) <a href="index.php?act=findpost&pid=2047041"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->Farcry 3's graphics don't even look better than Farcry 2's, in fact it may be worse if it wasn't for DX11.<!--QuoteEnd--></div><!--QuoteEEnd-->
Maybe you should take another look at Far Cry 2? Because nowadays it looks really crappy in places, especially compared to FC3.
I have taken an in-depth look at it. In FC3 the model fidelity is only at the highest setting on main characters used in cutscenes and for viewmodels. Shadow and HDR quality is lower than it was in the gameplay reveal at Gamescom and the trailers. I don't remember them using these little tricks in FC2; if I had both games installed I would make a comparison.
<!--quoteo(post=2046885:date=Dec 17 2012, 01:17 AM:name=Katana314)--><div class='quotetop'>QUOTE (Katana314 @ Dec 17 2012, 01:17 AM) <a href="index.php?act=findpost&pid=2046885"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->That would interest me...is this true?
I mean, just to clarify, normally when I see a scripting language like Lua, it's used to define actions and give basic instructions to the engine on the gameplay. ie;
Lua tells the engine "Create a particle system here, with this effect / sparsity, etc". Then, the C++-driven engine takes care of that particle system until Lua tells it to do something different.<!--QuoteEnd--></div><!--QuoteEEnd-->
From my experience with Unreal tech: the engine runs all the low-level calculations in the background. It basically loops its execution for the entire time the game is running, until you request to exit. In this loop the engine does: <ul><li>Load and initialize all Actors and Objects once at the start of the game/level.</li><li>Tick all Actors that are pre-asynchronous.</li><li>Tick all Actors that are asynchronous AND, in a separate thread, update the physics for all Actors that are not.</li><li>Tick all Actors that are post-asynchronous.</li><li>Render the game world to the screen.</li></ul>
Ticking means that each frame the script event Tick(float DeltaTime) gets called by the engine and tells the Actor how much time has passed since the last tick, so it can implement some custom behaviour that happens every frame. Native C++ code also takes care of updating any pending animations/AnimTree data during the Tick. It also processes all pending Timers on an Actor and calls the associated function for a Timer if it pops. Pre-/post-asynchronous simply means that these Actors are updated either before or after the physics get calculated for them in that frame. Asynchronous Actors get updated in a separate thread while physics are updated. This can obviously only be used for Actors that don't require any physics updates for themselves at all.
During the physics update, the engine translates the Actor's Acceleration property into Velocity, and the Velocity property ultimately into a change of its Location and Rotation in the game world. This is all handled by the native C++ code as well. During this update, the engine may call other script events to notify the Actor that it bumped into a blocking Actor or just touched a non-blocking one.
Other engine events include calling several initialization functions right after an Actor gets spawned by the engine. Or notifying it when it gains a "child" Actor. Or gets its base changed. Or landed after a falling physics calculation. Or took damage. Or is about to get destroyed by the engine. Or fell out of the world. Or had a "long fall". Or to notify a specific Actor of user key input. Or when a property has been replicated to a client.
The script can still call some native functions, but the general flow is that the engine is always the one to call the script events, and then the script can call other engine functions from there again. And this is not even talking about client-server differences yet. ;)
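The engine-drives-the-script flow described above can be sketched like so (Python, with illustrative names, not Unreal's actual API): the engine owns the loop and calls each actor's tick with the frame's delta time; actors never drive the loop themselves.

```python
class Actor:
    """Stand-in for a scripted Actor; the engine calls tick() each frame."""

    def __init__(self):
        self.age = 0.0

    def tick(self, delta_time):
        # Per-frame script logic; the engine tells us how much time
        # passed since the last frame.
        self.age += delta_time

def run_engine(actors, frame_times):
    """Stand-in for the native engine loop: one iteration = one frame."""
    for dt in frame_times:
        for actor in actors:
            actor.tick(dt)  # engine calls into "script" code
        # ...physics update and rendering would happen here...

actors = [Actor(), Actor()]
run_engine(actors, [0.016, 0.016, 0.033])  # three frames of varying length
print(round(actors[0].age, 3))  # 0.065
```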
<!--quoteo(post=2046885:date=Dec 16 2012, 04:17 PM:name=Katana314)--><div class='quotetop'>QUOTE (Katana314 @ Dec 16 2012, 04:17 PM) <a href="index.php?act=findpost&pid=2046885"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->That would interest me...is this true?<!--QuoteEnd--></div><!--QuoteEEnd-->
My understanding is that NS2 spends more time executing Lua code than it spends executing C++ code; thus the engine is waiting on the Lua :)
There is some truth in what you say though: the Lua is making calls on the engine, not the other way around. My point was that the engine isn't the bottleneck for performance on most people's setups.
This shouldn't be surprising, as the entire game logic is written in Lua, and the engine is effectively running multiple instances of the Lua code in order to do lag compensation / prediction.
To some extent you can check whether Lua or the engine is the bottleneck on your machine by typing 'r_stats 1' in the NS2 console. If the GPU is waiting on the CPU, Lua is what is limiting your frame rate :)
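The underlying arithmetic of that check is simple: a frame can't finish until both the CPU work (engine + Lua) and the GPU work for it are done, so whichever side takes longer sets your frame rate. A toy sketch with made-up per-frame timings:

```python
def frame_rate(cpu_ms, gpu_ms):
    """Rough frame rate when CPU and GPU work overlap: the slower side wins."""
    frame_ms = max(cpu_ms, gpu_ms)  # the slower side limits the frame
    return 1000.0 / frame_ms

# CPU/Lua-bound (the NS2 case described above): 25 ms of CPU work per frame.
print(frame_rate(25.0, 8.0))   # 40.0
# Doubling GPU speed changes nothing while the CPU is the bottleneck.
print(frame_rate(25.0, 4.0))   # 40.0
# Halving the CPU time is what actually doubles the frame rate.
print(frame_rate(12.5, 8.0))   # 80.0
```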
Your turn, UWE.<!--QuoteEnd--></div><!--QuoteEEnd-->
Consider me thoroughly baited, for the entertainment of everyone on this board. If you would like us to take your posts seriously in future, I suggest you stop your persistent trolling across the NS2 community. Posting 'my GTX 460 runs Far Cry 3 like a dream' in the same breath as 'UWE fix your performance' is a surefire way to put you on my ignore list. I have a fully watercooled OC'd 3930k + GTX 590 setup and it chokes on that game. And for all its successes, it doesn't look any better than Crysis 1.
You even chose to troll Flaterectomy during his ModSpot this weekend on NS2HD. Check yourself. You are distracting attention from what is a wonderfully interesting technical discussion.
<!--quoteo(post=2047335:date=Dec 17 2012, 05:16 PM:name=Strayan (NS2HD))--><div class='quotetop'>QUOTE (Strayan (NS2HD) @ Dec 17 2012, 05:16 PM) <a href="index.php?act=findpost&pid=2047335"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->Consider me thoroughly baited, for the entertainment of everyone on this board. If you would like us to take your posts seriously in future, I suggest you stop your persistent trolling across the NS2 community. Posting 'my GTX 460 runs Far Cry 3 like a dream' in the same breath as 'UWE fix your performance' is a surefire way to put you on my ignore list. I have a fully watercooled OC'd 3930k + GTX 590 setup and it chokes on that game. And for all its successes, it doesn't look any better than Crysis 1.
You even chose to troll Flaterectomy during his ModSpot this weekend on NS2HD. Check yourself. You are distracting attention from what is a wonderfully interesting technical discussion.<!--QuoteEnd--></div><!--QuoteEEnd-->
10/10 mad
edit: at 1080p, with no AA, shouldn't your setup handle Far Cry 3 on ultra?
Yup. "should" - the most frustrating word a graphics engineer can hear.
All games and all computer setups are different in ways people don't realize. It's far from a raw power-output number. Anyone posting some sort of "proof" or reasoning that is less than a paragraph long is missing the point.
Your turn, UWE.<!--QuoteEnd--></div><!--QuoteEEnd--> <!--quoteo(post=2047412:date=Dec 17 2012, 10:25 PM:name=6john)--><div class='quotetop'>QUOTE (6john @ Dec 17 2012, 10:25 PM) <a href="index.php?act=findpost&pid=2047412"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->What I found more noticeable about this is:<!--QuoteEnd--></div><!--QuoteEEnd-->
I posted this in another thread but I'll post it here too.
The NS2 SLI profile has been in since 310.54 Beta (Released Nov 12th).
The 310.70 WHQL drivers are exactly the same as the 310.70 Beta except for the WHQL status.
Driver notes only note improvements between WHQL releases and not between beta releases. The previous WHQL driver was 306.97.
I have an i7 2600k @ 4.8 GHz with a GTX 680. Don't get me wrong, I am not trying to e-peen, but for me NS2 runs awesome. My boy runs a Q6600 at stock with a 9800 GTX; I have his settings on LOW and it runs smooth too. So I am not sure what all the complaints are about. I am sure that there are some people out there suffering with performance issues, but do they just blame the game because it is the easy scapegoat, or do they expect too much from their system and demand it run on high settings? I mean, a Q6600 and 9800 GTX is probably about as low as you would like to go here, and it runs pretty well considering there are no overclocks at all, but my machine just chews up NS2 and spits it out.
The problem isn't how Far Cry 3 runs on absolute maximum settings, though. The problem is how badly NS2 runs on anything below a certain hardware level.
Far Cry 3 <b>does</b> run like a dream on high (not ultra w/DX11) settings on 2 year old machines. Meanwhile NS2 late game requires a 4 GHz+ overclocked current-gen CPU just to get 40+ FPS during combat, regardless of settings.
Just to compare, minimum system requirements of Far Cry 3, from Steam store page:
OS: Windows XP, Windows Vista and Windows 7
Processor: <b>Intel Core®2 Duo E6700 @ 2.6 GHz</b> or AMD Athlon64 X2 6000+ @ 3.0 GHz or better
Memory: 2 GB RAM
Graphics: 512 MB Video RAM (1 GB Video RAM), DirectX 9c (DirectX 11), Shader Model 3.0 (Shader Model 5.0)
DirectX®: 9.0c
Hard Drive: 15 GB HD space
Sound: DirectX Compatible (Surround Sound 5.1 capable recommended)
Other Requirements: Broadband Internet connection
Additional: *Supported Video Cards at Time of Release: AMD Radeon™ HD 2900 / 3000 / 4000 / 5000 / 6000 / 7000 series, NVIDIA® <b>GeForce® 8800 GTX</b> / 9 / 200 / 400 / 500 / 600 series. Laptop versions of these cards may work, but are not supported. These chipsets are the only ones that will run this game. For the most up-to-date minimum requirement listings, please visit the FAQ on our support website at <a href="http://support.ubi.com" target="_blank">http://support.ubi.com</a>.<!--QuoteEnd--></div><!--QuoteEEnd-->
And Natural Selection 2 minimum system requirements, directly from Steam store page:
OS: Windows 7 32/64-bit / Vista 32/64 / XP
Processor: <b>Core 2 Duo 2.6 GHz</b>
Memory: 2 GB RAM
Graphics: DirectX 9 compatible video card with <b>1GB, ATI X800, NVidia 8600</b> or better
DirectX®: 9.0
Hard Drive: 5 GB HD space<!--QuoteEnd--></div><!--QuoteEEnd-->
Guess which one runs at acceptable framerates at minimum settings?
I get that it must be frustrating to deal with these forums, but please don't boast about your system specs and claim it chokes on a certain game while ignoring your own game's problems. It's better not to post at all in these situations.
<!--quoteo(post=2047505:date=Dec 18 2012, 10:34 AM:name=Uruktos)--><div class='quotetop'>QUOTE (Uruktos @ Dec 18 2012, 10:34 AM) <a href="index.php?act=findpost&pid=2047505"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->Meanwhile NS2 late game requires 4Ghz+ overclocked next gen CPU just to get 40+ FPS during combat, regardless of settings.<!--QuoteEnd--></div><!--QuoteEEnd-->
This is not true. You can get >40 fps in combat with a stock 3.x GHz CPU. With a non-OC i7 2600k I get 40 FPS in heavy combat situations on max settings, and with minimum details I have a constant 60+ FPS.
There is a simple truth to the performance issues: NS2 is a computer game, while FC3 is made for consoles. Sure, FC3 looks better on the same hardware, but on the other hand it has no dynamic objects. While in NS2 you can build pretty much wherever and whatever you like, you can't do anything remotely similar in FC3.
FC3 has its map and nothing will ever change about this map. Missions are separate instances, ensuring there is no dynamic component to the environment. In NS2 everything changes all the time: new structures are added and removed, Infestation spreads and is removed constantly.
Also there is the obvious fact that FC3 is a multi-million dollar franchise backed by a big publisher, while NS2 is developed by fewer than 10 guys who were virtually bankrupt for a major part of the development.
Comparing NS2 to FC3 really is like comparing Monsanto to your local organic farmer selling his produce at the weekly market.
People keep spitting out their computers' specs as if they mattered for a beta game.
My computer hasn't changed, yet my fps went up and down with each patch, and the latest couple of patches made my fps drop considerably in combat. This isn't about computer builds; this is about something very specific within the game engine causing chokes.
When it's found, I expect that my fps should either remain steady or rise with each patch.
I have two fears for NS2.
1- Performance does not improve, people keep boasting about their high-end specs, and everyone forgets that the reason NS1 was awesome is because everyone could play.
2- New content actually worsens performance! That's a scary thought!
When I first built this new PC a few years back, I kept NS2 in mind and had a ballpark of what it would take to play... that's been ######ed up, but with an OC I was at least hanging in there! Now it's just comm, gorge, build, repair, or play Where's Waldo.
<!--quoteo(post=2047522:date=Dec 18 2012, 11:25 AM:name=gnoarch)--><div class='quotetop'>QUOTE (gnoarch @ Dec 18 2012, 11:25 AM) <a href="index.php?act=findpost&pid=2047522"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->This is not true. You get >40 fps in combat with a stock 3.x Ghz CPU. With a non-OC i7 2600k I get 40 FPS in heavy combat situations and MAX Setting With minimum Details I have constant >>60 FPS.<!--QuoteEnd--></div><!--QuoteEEnd-->
Not to rain on your parade... but I honestly doubt those numbers, especially late game on big servers with 16+ players. Why? Because I own a 2600k myself, and because people are only looking at client fps without noticing how choppy the actual game feels regardless of client fps.
You can have >40 fps in combat with no packet loss and the game will still feel horribly unresponsive, especially if the round has been going on for a while. People are looking for the problem in all the wrong places, imho; most of the performance issues for high-end CPU users seem to have started when UWE capped client fps calculation for some stuff to the server tick rate. Maybe that's where another look should be taken?
Comments
Python is a programming language that is categorized as "high level", which is pretty much short for "easy to understand and read" as opposed to "low level" which dips down into c# C++ and Assembly.
Python is up there in the ranks of Java, Perl, PHP and Visual Basic / Basic as part of the more 'well known' programming languages, Other than that there's ruby and some other stuff down along the line.
high level means it is first compiled downwards to the level of the 'high' interpreter which can be C++ or C#, of which it receives it's instructions fron and is compiled downwards further into Assembly which is the lowest you can go.
Allright, enough about that, LUA like Actionscript for flash or Javascript for websites or Unreal script (for unreal games) or any thing that has the word 'script' in it is a 'scripting' language. Most of them are based upon populair low level entries like C++ and C# as far as its 'syntax' is concerned.
But they are, and remain scripting languages.
A scripting language basically runs on an engine that incorporates several functions and libraries into a front that you can approach with this scripting language.
Lets say you write a function in C++ and gave it a bunch of paramters, then that function takes care of all the internal memory managing and typecasting and error checking.
Lua uses this function and you can enter its parameters, without the scare of it crashing if you enter the wrong things or leave them empty.
The game engine currently in use by UWE uses this as an interpreter for the core game mechanics which ARE written in C# and C++.
Lua is used as a catalyst in this manner to speed up the process without too much complex designs.
So when you got an idea you can pretty much implement a quick showcase, without months of interior engine design.
The Lua code written is inherent which means it will only be as fast as the parent that runs it, in this case, the engine written in C++.
Whenever you see changes to the game that have much to do with graphical stuff and fixes, it's mostly done so to the engine (which is c++) rather than the game itself. (which is Lua)
Hope that clears things up a little.
Ninja edit:
As far as performance goes that really is the engine running the game currently, it just needs a lot of optimizing. Like i said, the game will only run as fast as the engine allows, so they've got to start cutting corners everywhere to make things run smoother. It just takes time, buy a better rig or upgrade your current one if you want it to make a difference.
I picked up this 1200$ pc last year around this time, and it runs smooth as balls. I imagine what you can get for that price will be twice as good now than it was then.<!--QuoteEnd--></div><!--QuoteEEnd-->
Is it just me or is this post full of misinformation?
Python is an interpreted scripting language. So is Lua. In both cases the interpreter / VM for the language is written in C/C++. Technically even Java and C# are interpreted languages because they don't run on the host instruction set, and must instead operate through a native run time. These days when people use the world interpreted language, they are referring to a language which doesn't need to be compiled before it is executed. (perl, python, lua, and javascript being the most commonly used)
Calling C# low level is a bit strange. C# and Java are comparable in complexity and usability. In both cases they compile to byte code executed by a native VM.
In NS2, the game engine is usually waiting on the Lua, not the other way around.
Python is an interpreted scripting language. So is Lua. In both cases the interpreter / VM for the language is written in C/C++. Technically even Java and C# are interpreted languages because they don't run on the host instruction set, and must instead operate through a native run time. These days when people use the world interpreted language, they are referring to a language which doesn't need to be compiled before it is executed. (perl, python, lua, and javascript being the most commonly used)
Calling C# low level is a bit strange. C# and Java are comparable in complexity and usability. In both cases they compile to byte code executed by a native VM.
In NS2, the game engine is usually waiting on the Lua, not the other way around.<!--QuoteEnd--></div><!--QuoteEEnd-->
Yeah which is why I kind of ignored it.
Languages are called high- and low-level based on how far removed from assembly they are and how much like English they read. Most programmers will be using high-level languages.
<a href="http://puu.sh/1BtSW" target="_blank">http://puu.sh/1BtSW</a> 120k DX11 tessellated in-game objects (barrels)
<a href="http://puu.sh/1BtLL" target="_blank">http://puu.sh/1BtLL</a> physics applied, to only some of them though, from far away. Also sharks n sheet
Yea okay buddy.
This is on Ultra. I can get 70fps in ULTRA DX11 on the campaign too, in areas where there aren't any AI entities. With a Q6600 and a GTX460.
<a href="http://cloud.steampowered.com/ugc/920120753268586669/CB2CF585C29AF56964F4FB3B9E3AA9A8978E66D7/" target="_blank">http://cloud.steampowered.com/ugc/92012075...3AA9A8978E66D7/</a> MEANWHILE, IN NS2.
@the OP: How is Far Cry 3 your favorite single-player game of all time? I think it's a great game, but it squandered a lot of its potential. The looting system is really boring, there is very little reason to explore (relics mean nothing), and it's a little limited in the weapons category. I think if they had taken more cues from, say, Fallout 3, it would have been a MUCH better game.
Co-op has *yet again* fallen by the wayside ... I played the mega-clunky FC1 co-op mod yesterday with a flatmate .. nuff said.
OP: In the last ~8 months a lot of builds/patches have had optimization elements. Most people seem to forget NS1 was a system drain too; its long legs are in some way attributable to the game's visual aspect benefiting from hardware increases.
That would interest me...is this true?
I mean, just to clarify, normally when I see a scripting language like Lua, it's used to define actions and give basic instructions to the engine on the gameplay. i.e.:
Lua tells the engine "Create a particle system here, with this effect / sparsity, etc." Then, the C++-driven engine takes care of that particle system until Lua tells it to do something different.
In a similar vein, any webpage you go to will actually "freeze" during the time that JavaScript is executing. You don't notice it, because it all does very basic and asynchronous things. e.g., in order to show a menu, it will change a CSS class which is then noticed and handled by the browser (i.e., the engine). For actions that can take a long time, they work asynchronously, leaving execution until the next component is ready and available.
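That script/engine relationship (script issues a high-level command once; the native engine owns the object and keeps animating it) can be sketched in a few lines. This is a hypothetical Python stand-in for the Lua/C++ split; none of these names are real NS2 or Spark API.

```python
# Minimal sketch (hypothetical API): a scripting layer queues high-level
# commands, and the "engine" loop owns the objects and updates them every
# frame without further script involvement.

class Engine:
    def __init__(self):
        self.commands = []   # commands queued by the script layer
        self.systems = {}    # engine-owned objects, e.g. particle systems

    def script_call(self, name, **params):
        """Entry point exposed to the scripting language."""
        self.commands.append((name, params))

    def run_frame(self):
        # Drain the script's commands, then do the engine-side simulation.
        for name, params in self.commands:
            if name == "create_particle_system":
                self.systems[params["id"]] = dict(params, age=0)
        self.commands.clear()
        for system in self.systems.values():
            system["age"] += 1   # engine keeps animating with no script input

engine = Engine()
# The "script" issues one instruction; the engine then runs on its own.
engine.script_call("create_particle_system", id="sparks", density=0.5)
for _ in range(3):
    engine.run_frame()
print(engine.systems["sparks"]["age"])  # 3
```

The point of the pattern is that the expensive per-frame work stays on the native side; the script only crosses the boundary when behaviour should change.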
On FC3: It would not surprise me that Ubisoft simply have far more experienced engineers working on that sort of thing than Unknown Worlds does. Engine design, and tweaking performance on the hundreds of GPUs out there, is not an easy thing.
1) I have no idea what the performance of Max's Lua VM is like as compared to the reference Lua interpreter
2) This may not even be a large bottleneck for NS2 on the client-side
Reference: <a href="http://luajit.org/performance_x86.html" target="_blank">http://luajit.org/performance_x86.html</a>
Anyway, I remember this was a big discussion many months back. As for multi-threading and multi-core processing: they are two different things. Threads on a single core aren't actually simultaneous; really only one thread can be processed at a time, but the state of each thread is saved so it appears that both are operating simultaneously. Multi-processing is true parallelism, where things are actually being processed at the same time by each core.
Now, the OS is actually what assigns which thread to which core, and in programming that is influenced by stating which co-routines will be running where. In Lua that option isn't really available through the standard library. You can, however, divide things up into different VMs, which the OS theoretically runs in different threads and therefore on separate cores. I have my doubts as to whether this is truly parallel, but that's really not the point.
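To make the coroutine point concrete: coroutines are cooperative, not parallel. The sketch below uses Python generators as a stand-in for Lua coroutines; a round-robin scheduler resumes each one in turn on a single thread, so execution interleaves but nothing ever runs simultaneously.

```python
# Cooperative "threads": each task must explicitly yield control back to
# the scheduler. Only one task is ever running; the interleaving is what
# creates the illusion of concurrency.

def worker(name, steps, log):
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield  # hand control back to the scheduler

def run_cooperative(tasks):
    # Round-robin scheduler: resume each coroutine in turn until all finish.
    while tasks:
        task = tasks.pop(0)
        try:
            next(task)
            tasks.append(task)   # not finished; requeue it
        except StopIteration:
            pass                 # finished; drop it

log = []
run_cooperative([worker("A", 2, log), worker("B", 2, log)])
print(log)  # ['A:0', 'B:0', 'A:1', 'B:1']
```

Note the strict alternation in the output: that is scheduling by hand-offs on one core, which is exactly why coroutines alone don't buy you multi-core parallelism.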
The truth of the matter is yes, Lua is significantly slower than C++ in the vast majority of applications. Depending on the usage it can be 40x slower, or it could actually be 2x faster. My guess is that here it is significantly slower, but since we can't test the game written in 100% C++ we'll never really know.
Either way, Max has done a pretty great job so far increasing performance and I'm sure as time goes on it will get better.
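The "interpreted code is slower" point is easy to demonstrate with a quick micro-benchmark, shown here in Python rather than Lua since the idea is the same: the identical summation done in an interpreted loop versus a built-in implemented in C. The exact ratio varies by machine and workload, which is also why blanket "40x slower" figures should be taken loosely.

```python
# Interpreter overhead in miniature: a hand-written interpreted loop vs.
# a built-in (C-implemented) routine computing the same result.

import time

N = 1_000_000

def timed(fn):
    start = time.perf_counter()
    result = fn()
    return result, time.perf_counter() - start

def interpreted_sum():
    # Every iteration goes through the interpreter's dispatch loop.
    total = 0
    for i in range(N):
        total += i
    return total

slow_result, slow_t = timed(interpreted_sum)
fast_result, fast_t = timed(lambda: sum(range(N)))  # loop runs in C

# Both paths must agree with the closed-form answer.
assert slow_result == fast_result == N * (N - 1) // 2
print(f"interpreted loop: {slow_t:.4f}s, built-in sum: {fast_t:.4f}s")
```

This is also roughly what a tracing JIT like LuaJIT attacks: it compiles the hot interpreted loop down to native code, collapsing the gap between the two timings.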
I don't have FC3 (yet), but I heard that it eats GTX 680s for breakfast on Ultra. Kinda like the first Crysis, which a single card still can't render properly ( <a href="http://benchmarkreviews.com/index.php?option=com_content&task=view&id=877&Itemid=72&limit=1&limitstart=4" target="_blank">http://benchmarkreviews.com/index.php?opti...mp;limitstart=4</a> notice how the GTX 680 OC delivers an AVERAGE framerate of 66, which means half the time it's below 60)... and how long ago was that released, 5 years ago?
I'm still hoping that Maxwell (coming out next year) can run Crysis 1 properly. I've been waiting for a card to replay that with :-) .
There's no way you run FC3 maxed out at a decent resolution with a gtx460 at a playable framerate...
Also, I would like to remind people about earlier builds that didn't limit client fps to server fps. I don't remember the exact build anymore, maybe 218 or something like that; it's the one where skulks got hit pretty hard because suddenly all movement got messed up. Since that build I've been suffering from performance issues I didn't have before. The game used to run just fine with 100+ fps maxed out; since the changes to client fps/server fps, way too much of the client performance seems to be tied to server performance. For me that led to generally worse client fps and a very noticeable difference in how fluid/reactive the game feels overall, especially in later stages of a round when the map is plastered with buildings and units.
And that's a pretty big bummer, because you can't fix issues like these by simply giving your client more performance; that won't help at all when the actual bottleneck seems to be the server.
<!--quoteo(post=2047041:date=Dec 17 2012, 10:29 AM:name=NeoRussia)--><div class='quotetop'>QUOTE (NeoRussia @ Dec 17 2012, 10:29 AM) <a href="index.php?act=findpost&pid=2047041"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->Farcry 3's graphics don't even look better than Farcry 2's, in fact it may be worse if it wasn't for DX11.<!--QuoteEnd--></div><!--QuoteEEnd-->
Maybe you should take another look at Far Cry 2? Because nowadays it looks really crappy in places, especially compared to FC3.
I mean, just to clarify, normally when I see a scripting language like Lua, it's used to define actions and give basic instructions to the engine on the gameplay. [...]<!--QuoteEnd--></div><!--QuoteEEnd-->
From my experience with Unreal tech:
The engine runs all the low-level calculations in the background. It basically loops its execution for the entire time the game is running, until you request to exit.
In this loop the engine basically does:
<ul><li>Load and initialize all Actors and Objects once at the start of the game/level.</li><li>Tick all Actors that are pre-asynchronous.</li><li>Tick all Actors that are asynchronous AND in a separate thread update the physics for all Actors that are not.</li><li>Tick all Actors that are post-asynchronous.</li><li>Render the game world to the screen.</li></ul>
Ticking means that each frame the script event Tick(float DeltaTime) gets called by the engine and tells the Actor how much time has passed since the last tick, so it can implement some custom behaviour that happens every frame. Native C++ code also takes care of updating any pending animations/AnimTree data during the Tick. It also processes all pending Timers on an Actor and calls the associated function for a Timer if it pops.
Pre-/post-asynchronous simply means that these Actors are updated either before or after the physics get calculated for them in that frame.
Asynchronous Actors get updated in a separate thread while physics are updated. This can obviously only be used for Actors that don't require any physics updates for themselves at all.
During the physics update, the engine translates the Actor's Acceleration property into Velocity, and the Velocity property ultimately into a change of its Location and Rotation in the game world. This is all handled by the native C++ code as well. During this update, the engine may call other script events to notify the Actor that it bumped into a blocking Actor or just touched a non-blocking one.
Other engine events include stuff like calling several initialization functions right after an Actor gets spawned by the engine. Or notifying it when it gains a "child" Actor. Or gets its base changed. Or landed after a falling physics calculation. Or took damage. Or is about to get destroyed by the engine. Or fell out of the world. Or had a "long fall". Or to notify a specific Actor of user key input. Or when a property has been replicated to a client.
The script can still call some native functions, but the general flow is that the engine is always the one to call the script events, and then the script can call other engine functions from there. And this is not even talking about client-server differences yet. ;)
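A toy version of that flow, with hypothetical names (this is a sketch of the pattern, not Unreal's actual API): each frame the loop calls the script-side tick with the elapsed time, then native-side code integrates Velocity into a change of Location.

```python
# Sketch of an engine tick loop: the engine drives the script hook every
# frame, then the "native" physics step integrates velocity into position.

class Actor:
    def __init__(self):
        self.velocity = 0.0
        self.location = 0.0

    def tick(self, delta_time):
        """Script-side hook: custom per-frame behaviour goes here."""
        self.velocity = 10.0  # e.g. keep pushing toward a target

def physics_update(actor, delta_time):
    # Native-side work: translate velocity into a change of location.
    actor.location += actor.velocity * delta_time

actor = Actor()
for _ in range(60):            # simulate 60 frames at ~16.7 ms each
    dt = 1.0 / 60.0
    actor.tick(dt)             # pre-physics script tick
    physics_update(actor, dt)  # engine-side integration

print(round(actor.location, 3))  # 10.0 units after one simulated second
```

Scaling everything by delta_time is what makes the simulation frame-rate independent: the actor covers the same distance per second whether the loop runs 30 or 60 times.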
My understanding is that NS2 spends more time executing Lua code than it spends executing C++ code, thus the engine is waiting on the Lua :)
There is some truth in what you say though: the Lua is making calls on the engine, not the other way around. My point was that the engine isn't the bottleneck for performance on most people's setups.
This shouldn't be surprising, as the entire game logic is written in Lua, and the engine is effectively running multiple instances of the Lua code in order to do lag compensation / prediction.
To some extent you can check whether Lua or the engine is the bottleneck on your machine by typing 'r_stats 1' in the NS2 console. If the GPU is waiting on the CPU, Lua is what is limiting your frame rate :)
oh hey look, FC3 GTX680 performance fixed.
Your turn, UWE.
oh hey look, FC3 GTX680 performance fixed.
Your turn, UWE.<!--QuoteEnd--></div><!--QuoteEEnd-->
To be fair this driver update pretty much did nothing for me in FC3. But hey.
oh hey look, FC3 GTX680 performance fixed.
Your turn, UWE.<!--QuoteEnd--></div><!--QuoteEEnd-->
Consider me thoroughly baited, for the entertainment of everyone on this board. If you would like us to take your posts seriously in future, I suggest you stop your persistent trolling across the NS2 community. Posting 'my GTX 460 runs Far Cry 3 like a dream' in the same breath as 'UWE, fix your performance' is a surefire way to put you on my ignore list. I have a fully watercooled, OC'd 3930K + GTX 590 setup and it chokes on that game. And for all its successes, it doesn't look any better than Crysis 1.
You even chose to troll Flaterectomy during his ModSpot this weekend on NS2HD. Check yourself. You are distracting attention from what is a wonderfully interesting technical discussion.
You even chose to troll Flaterectomy during his ModSpot this weekend on NS2HD. [...]<!--QuoteEnd--></div><!--QuoteEEnd-->
10/10 mad
edit: at 1080p with no AA, your setup should handle Far Cry 3 on ultra?
All games and all computer setups are different in ways people don't realize. It's far from a raw power-output number. Anyone posting some sort of "proof" or reasoning that is less than a paragraph long is missing the point.
oh hey look, FC3 GTX680 performance fixed.
Your turn, UWE.<!--QuoteEnd--></div><!--QuoteEEnd-->
What I found more noticeable about this is:
<!--quoteo(post=0:date=:name=link)--><div class='quotetop'>QUOTE (link)</div><div class='quotemain'><!--quotec-->Natural Selection 2 – added SLI profile<!--QuoteEnd--></div><!--QuoteEEnd-->
oh hey look, FC3 GTX680 performance fixed.
Your turn, UWE.<!--QuoteEnd--></div><!--QuoteEEnd-->
<!--quoteo(post=2047412:date=Dec 17 2012, 10:25 PM:name=6john)--><div class='quotetop'>QUOTE (6john @ Dec 17 2012, 10:25 PM) <a href="index.php?act=findpost&pid=2047412"><{POST_SNAPBACK}></a></div><div class='quotemain'><!--quotec-->What I found more noticeable about this is:<!--QuoteEnd--></div><!--QuoteEEnd-->
I posted this in another thread but I'll post it here too.
The NS2 SLI profile has been in since 310.54 Beta (Released Nov 12th).
The 310.70 WHQL drivers are exactly the same as the 310.70 Beta except for the WHQL status.
Driver release notes only list improvements between WHQL releases, not between beta releases. The previous WHQL driver was 306.97.
My boy runs a Q6600 at stock with a 9800 GTX. I have his settings on LOW and it runs smooth too, so I am not sure what all the complaints are about. I am sure that there are some people out there suffering from performance issues, but do they just blame the game because it is the easy scapegoat, or do they expect too much from their system and demand it run on high settings? I mean, a Q6600 and 9800 GTX is probably about as low as you would like to go here, and it runs pretty well considering there are no overclocks at all, but my machine just chews up NS2 and spits it out.
Far Cry 3 <b>does</b> run like a dream on high (not ultra w/ DX11) settings on 2-year-old machines. Meanwhile NS2 late-game requires a 4GHz+ overclocked next-gen CPU just to get 40+ FPS during combat, regardless of settings.
Just to compare, minimum system requirements of Far Cry 3, from Steam store page:
<!--quoteo--><div class='quotetop'>QUOTE </div><div class='quotemain'><!--quotec-->Minimum:
OS:Windows XP, Windows Vista and Windows 7
Processor:<b>Intel Core®2 Duo E6700 @ 2.6 GHz</b> or AMD Athlon64 X2 6000+ @ 3.0Ghz or better
Memory:2 GB RAM
Graphics:512MB Video RAM (1GB Video RAM), DirectX9c (DirectX11) Shader Model 3.0 (Shader Model 5.0)
DirectX®:9.0c
Hard Drive:15 GB HD space
Sound:DirectX Compatible (Recommended Surround Sound 5.1 capable)
Other Requirements:Broadband Internet connection
Additional:*Supported Video Cards at Time of Release: AMD Radeon™ HD 2900 / 3000 / 4000 / 5000 / 6000 / 7000 series, NVIDIA® <b>GeForce® 8800 GTX</b> / 9 / 200 / 400 / 500 / 600 series. Laptop versions of these cards may work, but are not supported. These chipsets are the only ones that will run this game. For the most up-to-date minimum requirement listings, please visit the FAQ on our support website at <a href="http://support.ubi.com" target="_blank">http://support.ubi.com</a>.<!--QuoteEnd--></div><!--QuoteEEnd-->
And Natural Selection 2 minimum system requirements, directly from Steam store page:
<!--quoteo--><div class='quotetop'>QUOTE </div><div class='quotemain'><!--quotec-->Minimum:
OS:Windows 7 32/64-bit / Vista 32/64 / XP
Processor:<b>Core 2 Duo 2.6 ghz</b>
Memory:2 GB RAM
Graphics:DirectX 9 compatible video card with <b>1GB, ATI X800, NVidia 8600 </b>or better
DirectX®:9.0
Hard Drive:5 GB HD space<!--QuoteEnd--></div><!--QuoteEEnd-->
Guess which one runs at acceptable framerates at minimum settings?
I get that it must be frustrating to deal with these forums, but please, don't boast your system specs and claim it chokes on a certain game while ignoring your game's own problems. It's better not to post at all in these situations.
This is not true. You get >40 fps in combat with a stock 3.x Ghz CPU.
With a non-OC i7 2600K I get 40 FPS in heavy combat situations on max settings.
With minimum details I have constant >>60 FPS.
There is a simple truth to the performance issues: NS2 is a computer game, while FC3 is made for consoles.
Sure, FC3 looks better with the same hardware, but on the other hand it has no dynamic objects. While in NS2 you can pretty much build wherever and whatever you like, you can't do anything remotely similar in FC3.
FC3 has its map, and nothing will ever change about this map. Missions are separate instances, ensuring there is no dynamic component to the environment. In NS2 everything changes all the time: new structures are added and removed, Infestation spreads and is removed constantly.
Also there is the obvious fact that FC3 is a multi-million-dollar franchise backed by a big publisher, while NS2 is developed by fewer than 10 guys who were virtually bankrupt for a major part of the development.
Comparing NS2 to FC3 really is like comparing Monsanto to your local organic farmer selling his product on the weekly market.
My computer hasn't changed, yet my fps went up and down with each patch, and the latest couple of patches made my fps drop considerably in combat. This isn't about computer builds; this is about something very specific within the game engine causing chokes.
When it's found, I expect that my fps should either remain steady or rise with each patch.
I have two fears for NS2.
1 - Performance does not improve, and people keep boasting about their high-end specs, forgetting that the reason NS1 was awesome is because everyone could play.
2 - New content actually worsens performance! That's a scary thought!
When I first built this new PC a few years back, I kept NS2 in mind and had a ballpark of what it would take to play... that's been ######ed up, but with an OC I was at least hanging in there! Now it's just comm, gorge, build, repair, or play Where's Waldo.
With a non-OC i7 2600k I get 40 FPS in heavy combat situations and MAX Setting
With minimum Details I have constant >>60 FPS.<!--QuoteEnd--></div><!--QuoteEEnd-->
Not to rain on your parade... but I honestly doubt those numbers, especially late game on big servers with 16+ players. Why do I doubt them? Because I own a 2600K myself, and because people are only looking at client fps without noticing how choppy the actual game feels regardless of client fps.
You can have >40 fps in combat and no packet loss, and the game will still feel horribly unresponsive, especially if the round has been going on for a while.
People are looking for the problem in all the wrong places, imho. Most of the performance issues for high-end CPU users seem to have started when UWE capped client fps calculation for some stuff to the server tick rate. Maybe that's where another look should be taken?