Integration step on server
<div class="IPBDescription">I don't get it</div>I'm looking into the movement code but I'm a bit confused on how it's actually working.
Basically, the game computes the player's velocity and then updates their position. The server does that 30 times per second, and the client does it fps times per second.
The basic formula for doing this is: newPosition = currentPosition + dt * velocity
which comes from Newton's laws of motion, F = ma (well, more precisely from the definition of velocity).
Now the integration step dt ("delta t") should be the time between each update: 1/30 second on the server and 1/fps second on the client.
For example, if the velocity is 1, the server moves the player by 1/30 meter at each update, which adds up to one meter after 30 updates, i.e. one second at 30 ticks. If you use a different dt, for example 1/60, then you only move 0.5 meter after one second.
On the client at 100 fps, it moves you by 1/100 meter at each update, which adds up to one meter after 100 updates, i.e. one second at 100 fps.
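To make that arithmetic concrete, here's a minimal sketch (Python, purely for illustration; the function name is mine, not the game's) of the same integration step run for one simulated second at both update rates:

```python
# Forward Euler: newPosition = currentPosition + dt * velocity,
# run for one simulated second at a given update rate.
def simulate_one_second(update_rate, velocity=1.0):
    position = 0.0
    dt = 1.0 / update_rate        # integration step = time between updates
    for _ in range(update_rate):  # update_rate steps cover one second
        position += dt * velocity
    return position

print(simulate_one_second(30))   # server: 30 updates of 1/30 m   -> ~1.0
print(simulate_one_second(100))  # client: 100 updates of 1/100 m -> ~1.0
```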
However, the server is using dt = 1/fps instead of 1/30. Here I just print the velocity (in parentheses) and the dt (right after):
<img src="http://i.imgur.com/Iuk2C.png" border="0" class="linked-image" />
Same velocity, same dt (1/100), but about 3 times more updates on the client. So on the client you move about 3 times faster than on the server.
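If that reading is right, the mismatch would look like this (same illustrative Python as above, with the dt from the log hard-coded):

```python
velocity = 1.0
dt = 1.0 / 100   # the dt the log shows on BOTH sides

# Server: 30 updates per second, but with the client's dt
server_distance = sum(dt * velocity for _ in range(30))   # 0.3 m after one second

# Client: 100 updates per second, same dt
client_distance = sum(dt * velocity for _ in range(100))  # 1.0 m after one second

print(client_distance / server_distance)  # ~3.3: the client outruns the server
```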
The only explanation I can see is that you're not actually moved during prediction (Client*), but then I'm confused about why the function is even called, or why the client doesn't integrate the position with dt = 1/30.
I probably missed something, but what?
Comments
Server runs at 30 fps => server dt = 1/30 s ≈ 0.033
Client runs at 100 fps => client dt = 1/100 s = 0.01
To keep the numbers readable, the example below scales both steps up by 10x: dt = 0.33 for the server and dt = 0.1 for the client (same ratio).
Let's take an example:
Let velocity = v = 20
and Starting position = x = 100
At each step, we'll have:
t += dt
x += dt * v
Server updates (dt = 0.33):
t = 0, x = 100
t += 0.33 = 0.33, x += 0.33 * 20 = 106.6
t += 0.33 = 0.66, x += 0.33 * 20 = 113.2
Client updates (dt = 0.1):
t = 0, x = 100
t += 0.1 = 0.1, x += 0.1 * 20 = 102
t += 0.1 = 0.2, x += 0.1 * 20 = 104
t += 0.1 = 0.3, x += 0.1 * 20 = 106
t += 0.1 = 0.4, x += 0.1 * 20 = 108
t += 0.1 = 0.5, x += 0.1 * 20 = 110
t += 0.1 = 0.6, x += 0.1 * 20 = 112
And a final partial step to reach t = 0.66: x += 0.06 * 20 = 1.2, so x = 113.2
The client makes about 3 times as many updates as the server, but ultimately arrives at the same value.
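For completeness, here's the same worked example as a quick Python script (using the scaled-up dt values from above), including the fractional last step that lands the client exactly on the server's time; integrate is just my name for it:

```python
def integrate(dt, v=20.0, x=100.0, t_end=0.66):
    t = 0.0
    while t + dt <= t_end + 1e-9:  # full steps that fit before t_end
        t += dt
        x += dt * v
    x += (t_end - t) * v           # fractional last step (e.g. 0.06 * 20)
    return x

print(integrate(0.33))  # server: 2 full steps           -> ~113.2
print(integrate(0.1))   # client: 6 full steps + partial -> ~113.2
```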
You should re-read the post.
Isn't there a way to log variables to a file, btw?