The behaviour they're describing is called "wear levelling". SSDs do this so that all the NAND cells are used evenly over the lifetime of the device. The more space you use on the drive, the less room it has to wear level, which increases long-term wear on the NAND. Newer SSDs (not all) include hidden over-provisioning (i.e. more flash than you can actually address) to counter this, since people generally want to use most of a drive's listed capacity. If this is happening to you, it could be addressed by a firmware update, by wiping the drive and following the manufacturer's instructions for restoring performance, or by enabling TRIM if it isn't enabled. Which method actually works varies from model to model and generation to generation, since it depends on the SSD's feature set, and older drives don't handle this as well as newer ones do.
...also, if the drive is near full (~85-90%), it has to keep moving data around the whole drive and will hit that cell write count a lot quicker.
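To make the "less free space means faster wear" point concrete, here's a toy Python model. This is purely illustrative (the function, block counts, and write counts are made up, and real SSD controllers are far more sophisticated): incoming writes get spread across whatever blocks are free, so a fuller drive concentrates the same write load on fewer cells.

```python
import random

def simulate_writes(total_blocks, used_blocks, writes, seed=0):
    """Toy wear-levelling model: each incoming write is remapped to a
    random block from the free pool, so erase wear is spread over
    whatever space is left. Returns the highest erase count seen."""
    rng = random.Random(seed)
    free = total_blocks - used_blocks
    erase_count = [0] * free
    for _ in range(writes):
        erase_count[rng.randrange(free)] += 1
    return max(erase_count)

# Same workload, different fill levels: the fuller drive hammers its
# few free blocks much harder.
roomy = simulate_writes(total_blocks=1000, used_blocks=100, writes=50_000)
full = simulate_writes(total_blocks=1000, used_blocks=900, writes=50_000)
```

With 900 of 1000 blocks occupied, the 50,000 writes land on only 100 blocks (~500 erases each on average) instead of 900 (~55 each), which is the wear-level squeeze described above.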
This just shows how much the UWE development team doesn't care about performance, a QoL change that could change how the game feels...
I spent all of today re-writing the effects manager for the game to be much more efficient, and we've got other performance improvements on the way too. Please do not assume that we don't care about performance, just because it isn't always the highest priority issue for us.
I do agree that I didn't choose the best words to describe it, but performance at this stage should be the real no. 1 priority. It should always be prioritized! Look at Rainbow Six: they finally decided to fix all their performance issues, and they've gone from 100-200k concurrent players up to 700k. Only after that did they keep adding content.
I know UWE is a smaller studio than Ubisoft, but it shows what a good long-term plan does for you. Fix the performance before adding content.
Handschuh — Join Date: 2005-03-08 — Member: 44338 — Members, NS2 Playtester, NS2 Community Developer
I think the issue with performance is: if you gain players, you quickly lose them again if the game doesn't perform well, and once performance improves, the reputation will rise. As it stands, I always have mixed feelings about advertising this game to friends who might like it, if I know they don't have a top-notch PC...
DC_Darkling — Join Date: 2003-07-10 — Member: 18068 — Members, Constellation, Squad Five Blue, Squad Five Silver
While some of the big servers out there perform well, there are plenty of smaller community servers whose ops don't really look into the performance setup that carefully.
That said, I've heard reports from server ops who saw odd performance behaviour that doesn't show up anywhere. So... research ongoing?
On topic of netcode:
I don't have a problem with dying around corners; it's prediction/compensation doing its job, and at least for me it doesn't happen frequently.
Should it be somewhat limited so that having 200+ ping doesn't confer a massive advantage? Yes.
What I still have a massive problem with is frame time variance.
Test conditions:
Empty server, standing in the hive room
Double-buffered vsync at 120 Hz
Frametime reporting via the "fps 2" command, set to report frame times 2 ms over the average (i.e. ~10 ms, or 100 fps, at a 120 fps average)
1080p, everything on minimal detail.
Do a quick 180 in the hive: frametimes over 14 ms get reported (less than ~70 fps)!
Looking steadily in each direction, the game maintains 120 fps (vsync); unsynced it's 200.
I.e. the computer has enough horsepower to maintain over 200 fps rendering these scenes.
How am I supposed to track a skulk/lerk zipping by if simply turning results in massive frame drops?
This fits nicely with the whole "everything is fine until there are enemies on screen" theme.
2 ms of variance is already a 20% margin; the variance I'm seeing is up to 40%.
And this is not some late-game scenario with lots of stuff going on; this is looking at a mostly static scene on an empty server!
To me this looks like the engine unloading data it should not unload.
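For reference, the "fps 2"-style reporting used in the test above can be mimicked in a few lines of Python (`slow_frames` is my own name for illustration; I don't know the real command's internals):

```python
def slow_frames(frametimes_ms, threshold_ms=2.0):
    """Mimic the 'fps 2' console report: flag every frame that took
    at least threshold_ms longer than the average frame time."""
    avg = sum(frametimes_ms) / len(frametimes_ms)
    return [t for t in frametimes_ms if t > avg + threshold_ms]

# Twenty smooth ~120 fps frames (8.3 ms each) plus one hitch during a
# quick 180-degree turn: only the 14.2 ms outlier gets reported.
frames = [8.3] * 20 + [14.2]
print(slow_frames(frames))  # [14.2]
```

This also shows why the report is useful: the average barely moves, so a single slow frame still stands out even though the displayed fps counter looks fine.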
"The Spark engine does support higher tickrate, it will just consume more CPU (serverside mostly) and bandwidth (the low sendrate is one thing that's causing many issues like ammo counts going back and forth when firing, etc). Increasing it will help, but won't totally fix the issue.
That's too big of a discrepancy to be simply down to tickrate and it's certainly not a "small precision error". Is the interpolation that different between the client and server that something that obvious and linear in movement gets mispredicted with that amount of ease of reproduction? There's other games running at laughably low tickrates that handle prediction more complex than that just fine. If it's a discrepancy it's certainly not small, it's a bit embarrassing to look at, honestly.
The problem with UWE and how they work is that they will keep on saying that something will take very long to do and therefore never even start doing it, even if the benefits are huge and really apparent. In any case I don't think with their current team they could tackle this properly, this is something that should have been done near release.
Increasing the hitboxes is an easy hack, but it also creates new problems. Tracking down the exact problem and fixing would take proper work."
I discovered recently, while checking my map with r_wireframe, that some geometry and props pop up at the sides of your screen (all four sides) that should be hidden by walls and occlusion culling. They aren't visible if you look directly at them.
This sounds similar to what you are describing, and it means that in some map spots way more stuff is rendered than necessary.
Just try it yourself: stand still in-game and turn with the mouse while using r_wireframe.
You can see props, especially big ones (skybox), and geometry from adjacent rooms pop up at the sides, which should be blocked by occlusion culling, and actually are blocked when you look straight in that direction.
I'm not sure whether this is supposed to work like that, so geometry doesn't pop in too late when you turn in-game,
or whether this is some kind of bug or sloppy occlusion culling.
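One possible (entirely unconfirmed) explanation for that edge pop-in, matching the "so geometry doesn't pop up too late" guess: the culler may test visibility against a frustum slightly wider than the camera's actual FOV, so a fast turn never exposes an unculled gap. A toy version of such a test, with made-up names and an arbitrary margin:

```python
def visible(angle_to_prop_deg, fov_deg, margin_deg=15.0):
    """Cull against a frustum widened by a safety margin, so that a
    fast turn never reveals geometry before the culler catches up.
    angle_to_prop_deg is measured from the view direction."""
    return abs(angle_to_prop_deg) <= fov_deg / 2 + margin_deg

# A prop 50 degrees off-centre is outside a 90-degree FOV (45-degree
# half-angle) but survives culling thanks to the margin, so it shows
# up "at the sides" in r_wireframe:
print(visible(50, 90))     # True  -> drawn even though off-screen
print(visible(50, 90, 0))  # False -> culled with no margin
```

If something like this is what the engine does, the wireframe observation would be intended behaviour, just with a generous margin; the cost is exactly the extra off-screen rendering described above.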
So we've identified performance issues in the engine's occlusion culling?
Nice and all, but to my limited programming knowledge, resolving those would be a rather hard thing to do.
This is rather fundamental stuff for a 3D engine, buried deep down in the complexity.
And I can't envision the dev team having the time to work on intensive stuff like this.
For the player it's super frustrating though:
1. A skulk is bouncing towards you at high speed; you fire at it, doing good damage.
2. The skulk closes the distance; you need to start turning to keep tracking it.
3. The game hitches; you lose track.
4. *You die.*
5. The skulk escapes with 30 hp.
Unlike dying around corners, this happens to me regularly.
dePARA — Join Date: 2011-04-29 — Member: 96321 — Members, Squad Five Blue
For me it has always been interesting to see that some people seem to have zero problems with any kind of fps drops, stutter, warping skulks, or inconsistent mouse feel.
I remember the beta, where we sometimes had unplayable builds. But even there: perfect aim.
Well, looks like us ordinary mortals can't see the matrix.
I have rather bad specs: I get the usual fps drops, long loading times, and a lot of freezes and hitches when I first load up the game and join a server.
The hitches should be gone eventually though.
After that I can enjoy the game and play well.
If you keep getting hitches: do you have enough memory, and do you use the low video memory option and low texture quality?
Mephilles — Germany — Join Date: 2013-08-07 — Member: 186634 — Members, NS2 Map Tester, NS2 Community Developer
My game sometimes freezes for half a second when I see the first alien after the round has started. It doesn't matter how often I've seen one in the pregame. (BTW, would installing NS2 on an SSD get rid of that problem?)
I had fairly bad specs about a year ago (i3, GTX 630, passively cooled), and on top of that I was playing on Linux with OpenGL. I was running NS2 on the lowest graphics settings and got something like 20-40 fps.
BUT it was installed on an SSD, and I never had glitches like that... so I guess an SSD would get rid of that problem.
I chose to investigate further:
I did a comparison test between limiting the framerate with vsync and with the "maxfps" command.
Result: frametimes were more stable with "maxfps".
Lower variance, and the drops not as severe (minimum of 78 fps with vsync vs. 100 fps with maxfps).
Ironhorse has been saying for years that maxfps is far better than vsync or GPU-driver frame limiting. I personally run with maxfps 121. I'd really like a mod that would set my maxfps to a value I can choose; it would be even cooler if it could set my maxfps to 30 when I'm AFK, just to save power.
Also, oddly enough, even when I use double-buffered vsync, it doesn't lock to half my refresh rate when I drop below it. Is adaptive vsync kicking in automatically? I have it set to application-controlled in the NVCP.
Just put "maxfps 121" in your Steam launch options to have it set all the time.
And if you're idle, just minimize the game; then it renders at 1 fps, AFAIK.
Or set up some keybindings for the maxfps command.
Maxfps doesn't work with steam launch options. I certainly tried.
Minimizing the game doesn't work long term. I often let my computer sit idle for hours while seeing. If minimized, it will sometimes crash.
Keybindings are pretty easy, but it is not automatic like I would like.
So I use maxfps 145... but I don't quite understand how it works. (It keeps the fan noise down with my open HD 598s.)
For example, if I'm in an area where I get a solid 100 fps and I use maxfps 50, which 50 frames don't render? Does it just change the timing so that 50 frames are rendered evenly across the second?
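A small sketch of the "drop frames" behaviour described further down the thread may answer that question (this is my own toy model, not the engine's actual code): a frame is presented only when enough time has passed since the last presented one, and everything in between is thrown away rather than delayed.

```python
def frames_rendered(frame_ready_ms, maxfps):
    """Toy frame limiter: present a frame only if at least 1000/maxfps
    milliseconds have passed since the last presented frame; drop the
    rest. Nothing is buffered, so no extra input delay is added."""
    interval = 1000.0 / maxfps
    shown, last = [], None
    for t in frame_ready_ms:
        if last is None or t - last >= interval:
            shown.append(t)
            last = t
    return shown

# 100 frames arrive over one second (one every 10 ms); with maxfps 50
# every other frame is presented and the rest are simply discarded:
ready = [i * 10 for i in range(100)]
print(len(frames_rendered(ready, 50)))  # 50
```

So under this model it isn't that 50 specific frames are "scheduled": whichever frames happen to arrive before their time slot opens are skipped, which keeps what's on screen as fresh as possible at the cost of possible tearing.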
Kouji_San — Sr. Hive Upkeeper, EUPT Deputy — The Netherlands — Join Date: 2003-05-13 — Member: 16271 — Members, NS2 Playtester, Squad Five Blue
Last I checked that console command, it felt like it simply drops frames out the window, causing choppiness. Not like a sync option that syncs to your screen, which, with V-Sync at least, feels floaty due to the input latency.
IronHorse — Developer, QA Manager, Technical Support & contributor — Join Date: 2010-05-08 — Member: 71669 — Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
edited April 2017
maxfps just drops frames in order to keep what's shown on screen as current as possible, even if it still results in screen tearing.
Vsync will increase input delay substantially, as it's buffering more frames, and thereby prevents any screen tearing.
Maxfps will just drop frames, which may still produce tearing, but will never induce any additional input delay.
Any other method of frame limiting besides maxfps will induce a certain amount of input delay.
@Mephilles Run p_logall in the console once you're in the ready room, before you see an alien. After you experience the stutter, type p_endlog or just quit the game.
You'll find the *.plog file in your hidden %appdata%/Natural selection 2/ folder, where your log.txt is. Please zip it up and PM it to me on Discord along with your tech support zip file.
There's other games running at laughably low tickrates that handle prediction more complex than that just fine.
@Ryssk
Examples? Are they running ~20 sendrate as well, and do they contain fast-moving entities like NS2 does, with potentially thousands of entities?
I've seen games get away with running ~30 tickrate/sendrate before, but their entities move as slowly as in CS, which effectively hides the symptoms.
And idk who you were quoting there, but no, NS2 does not handle a faster tickrate just fine; Spark might, as you say. That's the difference between the engine and the game code.
Any server op can test a sendrate of 60 with an interp of 30, and even if the server handles it just fine performance-wise, you'll still notice bugs like momentary world freezes due to Lua updates.
In other words, there are bugs and further optimizations required before such a thing can even be used reliably, as I said before.
There's a reason OW increased their competitive sendrate to 60: even with much slower-moving entities than NS2, it was still noticeable.
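To put rough numbers on the sendrate argument, here's a back-of-the-envelope sketch. The speeds below are my own guesses for illustration, not measured game values:

```python
def worst_case_gap_m(speed_mps, sendrate_hz):
    """Distance an entity can travel between two successive server
    snapshots -- the gap that interpolation has to paper over."""
    return speed_mps / sendrate_hz

# Guessed speeds: ~9 m/s for a fast-moving skulk, ~5.5 m/s for a
# CS-style runner. The faster the entity, the bigger the snapshot gap:
skulk_at_20 = worst_case_gap_m(9.0, 20)  # 0.45 m between snapshots
skulk_at_60 = worst_case_gap_m(9.0, 60)  # 0.15 m between snapshots
slow_at_20 = worst_case_gap_m(5.5, 20)   # smaller gap hides symptoms
```

This is the point about slow-moving entities: at the same 20 Hz sendrate, a slower entity covers a much shorter distance per snapshot, so the same interpolation machinery looks far more accurate.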
@coolitic, I see @BloodyIron has answered already
Performance equalizers!
*autistic screeching*
I noticed the exact same thing too on my map.
My test system: R7 1800X, 16 GB, R9 290, NS2 on an HDD.
Apparently not hard enough!
You need to put it in the launch options EXACTLY as I posted it (including the quote marks).