Does FPS Affect The Amount Of Adrenaline Used?
ShotgunEd
Join Date: 2004-01-02 Member: 24966Members
I've just got a new PC which manages to run NS at a half-decent fps. Anyway, when I fade on the old PC I hardly seem to use any adrenaline when blinking; I get about 20-30 fps on that machine. When I fade on the new PC, getting about 150-200 fps, just tapping blink seems to use loads of adrenaline. Seems to me that I'm better off setting my max fps to 25 when I fade...
Comments
And back then, more fps = better (like infinite jetpack fuel)
Anyway, yeah, I tested it once with fps_max 30 and it felt the same. Not sure, though.
It's not all good though; my graphics card died a sad death after one week, and it'll be another two weeks before I get the replacement. :(
Yeah, I wonder how some people can say things like that.
That's not true, the human eye only registers 60 fps. That's why TVs (well, dunno about new ones) only display 50 to 60 fps.
And there is still a visible difference between 60 and 100 fps.
I'm afraid it does not. Developer mode only shows the theoretical maximum with the current setup, not what you're actually getting. You *CANNOT* exceed 100 fps.
FPS still affects many things. Although "the higher the FPS, the better" is mostly true, the best FPS from a maths point of view is 25. This follows research done by "a civilian" and a detailed explanation from Soylent Green.
No, I do not recommend you set your max fps to 25. The rate-of-fire/adrenaline difference is hardly noticeable, whereas the jerkiness and ugliness certainly are.
I thought the human eye can perceive up to 35 Hz, but then I don't know why a 60 Hz monitor looks awful (headaches, even if you can't consciously notice the flickering) while a 75 Hz monitor is fine.
@topic:
I don't know if fps really affects these things, but maybe it's just that you can time your actions better with a higher framerate, so it seems like less adrenaline is used.
When can this myth die?
The human eye can register framerates in excess of 220 fps, and likely much more than that. The eye doesn't see in fps; it's a constant stream of information. Above 100 fps the video is so fluid that it's hard to notice a difference, but the difference is still there.
http://www.100fps.com/how_many_frames_can_humans_see.htm
A simple test would be to script a blink at 100 fps with 10 waits and then use a script with half the waits at 50 fps. If framerate makes no difference, the blinks should be identical in stamina usage and distance travelled. (Remember that a "wait" lasts one frame, so you'd need half as many waits at 50 fps to cover the exact same length of time.) If you're having a hard time measuring the difference, use FPS_MAX 10 with 1 wait for an extreme case.
// with hud_fastswitch 1 and FPS_MAX 100
slot2; +attack; wait; wait; wait; wait; wait; wait; wait; wait; wait; wait; -attack
// with FPS_MAX 50
slot2; +attack; wait; wait; wait; wait; wait; -attack
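If it helps with the maths, here's a back-of-the-envelope sketch in Python (my own illustration, not anything from the game's code): since one "wait" lasts one frame, a chain of waits spans num_waits / fps seconds, which is why both scripts above should cover the same real-time length.

# A "wait" lasts one frame, so a chain of waits spans num_waits / fps seconds.
def waits_duration(num_waits, fps):
    return num_waits / fps

print(waits_duration(10, 100))  # 0.1 s: 10 waits at 100 fps
print(waits_duration(5, 50))    # 0.1 s: 5 waits at 50 fps -- same real-time length
print(waits_duration(1, 10))    # 0.1 s: the extreme FPS_MAX 10 case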
Side note: The "human eye can only register X frames a second" thing is an obvious myth.
Quote: "That's not true, the human eye only registers 60 fps. That's why TVs (well, dunno about new ones) only display 50 to 60 fps."
That's why some fighter jet pilots can see up to 100 fps, then?
TV is set to the average fps perceivable by a certain number of people, not all people, just the average...
Not everyone is the same, ya know. :P
In a nutshell, our eye is a bit like a camera: even if data only gets sent to the brain every so often (a strictly continuous stream isn't possible for a nervous system; it's more like Morse code), the eye receives the data continuously. It's a bit like overexposure on a camera, when the shutter doesn't close fast enough: the image is blurred. This blurriness allows fluidity, because our brain sees each "picture" as a multitude of pictures. However, even though the real world sends out a continuous image, a screen only gives one crisp image every so often. The blurriness of a movie (created when filming) makes up for the lack of FPS, but games usually don't have blurriness.
Are you holding down attack for the same length of time? If so, then the reason you're using more adrenaline at higher fps is that you're "attacking" more than you did at a lower fps.
My understanding of the HL engine is that in each frame you can perform one command. So, with 30 fps, you can only perform 30 commands per second, with 100 fps, you can perform 100 per second. So, if you hold down attack for the same length of time at 30 fps as at 100 fps, you'll attack more in that given amount of time at 100 fps, therefore using more adrenaline.
You'll never be better off running at a lower fps. The rate of fire differences between 30 fps and 100 fps are *extremely* noticeable. This, combined with the choppiness of the video, will be a serious detriment to your gameplay. Just give yourself some time with the higher fps and you'll get used to it.
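If it helps to picture it, here's a toy sketch in Python. The drain value and the per-frame assumption are mine, not numbers from NS; it just shows what happens if the cost is charged once per processed frame instead of per second of real time:

# Toy model: adrenaline charged once per frame while +attack is held.
# All numbers are made up; only the shape of the effect matters.
def adrenaline_left(fps, seconds_held, start=100.0, drain_per_frame=0.5):
    frames = int(fps * seconds_held)   # frames processed while the key is down
    return max(0.0, start - frames * drain_per_frame)

print(adrenaline_left(30, 2.0))    # 70.0 left after 2 s at 30 fps
print(adrenaline_left(100, 2.0))   # 0.0 left after 2 s at 100 fps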
That's why low-poly models are the win. :D
You're saying the exact opposite of what the original poster said. He said lower fps means less adrenaline usage, which makes sense to me because the +attack command isn't sent as many times.
I suspect the "developer mode doesn't render over 100fps" comment came from the same guy that said "we can't see over 30fps" and "high pingers lag the server".
Wrt blinking, I can hold blink down until all the adrenaline is used for far longer on my old PC than I can on my new one.
Quote: "My understanding of the HL engine is that in each frame you can perform one command."
What? Wouldn't this mean that you can't shoot, crouch, jump, and move at the same time?
However, it's not really THAT noticeable... maybe a 10-20% increase from 100 fps to ~200.
theclam: Unless your FPS is incredibly low (under 20), you will not run into that problem.
cl_cmdrate is how many commands are sent to the server per second. As many commands as possible, up to the number set by cl_cmdrate, are sent, so if your FPS doesn't reach cl_cmdrate, only as many commands as your FPS allows will be sent. Yes, this can cause people to have horrible hit registration, as can having certain netcode commands set very low (there is a reason cl_rate is locked). Adjusting your interp to a low setting usually reveals who still has updaterate and cmdrate at the default settings, due to the terrible warping they exhibit versus someone who has adjusted their rates (keep in mind servers cap updaterate and rate, often leaving them very low, usually 25 or lower).
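Put another way, here's a rough Python sketch of my reading of it (an assumption, not documented engine behaviour): the number of commands actually sent per second is capped by whichever is lower, your framerate or cl_cmdrate.

# Effective commands per second, assuming one command per rendered frame,
# capped by cl_cmdrate (my reading of how it behaves, not documented fact).
def effective_cmdrate(fps, cl_cmdrate):
    return min(fps, cl_cmdrate)

print(effective_cmdrate(100, 30))  # 30 -- cl_cmdrate is the bottleneck
print(effective_cmdrate(20, 30))   # 20 -- low fps is the bottleneck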
Sandstorm: Developer mode only shows how high FPS *could* go, not how high they actually are. It is not possible to get over 100 fps. It also varies from person to person how sensitive their sight is. I personally have shoddy vision in one eye and better-than-normal vision in the other. With one eye I can't tell the difference between regular television (320x200@60Hz) and my computer screen (1900x1440@75Hz), but with the other I can tell whether a monitor is running at 75 or 60 Hz just by glancing at it (thankfully in humans both eyes work together, otherwise I'd be quite screwed up in the vision department). Most people cannot tell over 100 fps; their eyesight simply isn't sensitive enough. I personally don't notice over 60. I'm sure there are people who can tell 150 from 100; it will simply vary from person to person.
Err, isn't "wait" at ~200 fps shorter than "wait" at 100 fps?
It's not about rendering at all; it just seems to be the calculations.