DruBo (Back In Beige). Join Date: 2002-02-06. Member: 172. Members, NS1 Playtester.
On the subject of fps, yes, it is impossible to really see any difference above about 32 fps. This is one of the reasons that TV is at 30 fps. More simply aren't needed for that application.
However, it's true that having extra fps as a lag spike buffer is helpful.
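A quick sketch of that "buffer" idea (my own illustrative numbers, not the poster's): if a busy scene temporarily halves your frame rate, the higher your normal frame rate, the more playable the dip remains:

    # Hypothetical example: what a 50% slowdown does to different baseline frame rates.
    def fps_during_spike(baseline_fps, slowdown=0.5):
        """Effective fps if a heavy scene multiplies every frame's render time by 1/slowdown."""
        return baseline_fps * slowdown

    for baseline in (30, 60, 100):
        print(f"{baseline:3d} fps average -> ~{fps_during_spike(baseline):.0f} fps during the spike")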
Quote (Hida Tsuzua @ Aug. 12 2002, 03:25): "I used to play NS at 15 or so fps (I had the highest resolution on at the time). So you can play it. However, when I lowered the resolution, I did better. It should be noted that the most the eye can do is 32 or so fps, so hitting anything higher is kind of useless (except as a buffer for lag spikes)."
I'd heard of this theory, but if you can't process images that flicker in front of your eyes any faster than 30 fps, then why can you see a difference between a 60 Hz and an 85 Hz monitor refresh (85 Hz meaning 85 times a second)?
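For what it's worth, the raw numbers behind that question (my arithmetic, not from the thread): each refresh rate fixes how long every redraw stays on screen, and on a CRT that redraw interval is what people tend to notice as flicker:

    # Refresh period in milliseconds for the two rates mentioned above.
    for hz in (60, 85):
        print(f"{hz} Hz -> one redraw every {1000 / hz:.1f} ms")   # 16.7 ms vs 11.8 ms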
I'm not sure, but maybe 45 or somewhere around there. Lag stinks. lol :P
Quote (Hida Tsuzua @ Aug. 11 2002, 22:25): "I used to play NS at 15 or so fps (I had the highest resolution on at the time). So you can play it. However, when I lowered the resolution, I did better. It should be noted that the most the eye can do is 32 or so fps, so hitting anything higher is kind of useless (except as a buffer for lag spikes)."
No.
The human eye can distinguish more, but it sees motion at much lower than that. 19 fps looks a little bit jumpy, but it looks fairly motion-like. After lowering my resolution to 800x600 I now get about 35 fps. Sure, it may look poo-like, but because it's faster it looks much more fluid. In the same sense, when I go to my cousin's place (his computer is about six times faster than mine, and he has an ATI Radeon) and I see him running at 80+ fps, I can definitely tell the difference, and it makes a BIG difference. Beyond that it really gets redundant anyway.
Quote (greyfox555 @ Aug. 12 2002, 01:58): "You're wrong, Hida, the human brain refreshes at 24 frames a second. How do I know this? A while back a bunch of geeks got together in a major corporation lab to figure out the problem of expensive movie film... they found out that the human brain refreshes at 24 fps, and switched the film to that fps, so film would become MUCH cheaper. One of those geeks was my father in RCA technical labs :) If the frames per second hits 24 on your screen, or a TV, or a movie screen, you see flickering lines, because that 24 fps interferes with the 24 fps that your brain is running at. That's why they dim the lights in the movie theaters."
Sorry, no.
If the mind DID refresh at 24 fps (I don't know about this one), then when TV or movies ran at 24 fps you wouldn't see any flicker at all; in fact, it would be completely fluid. It would look just like real life. How on earth would it interfere?
The only flicker I ever see is when a computer monitor shows up on TV (someone filming a computer screen). The frame rates are different; that's the only FPS interference I can think of.
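A back-of-the-envelope way to picture that mismatch (my own simplified model, not from the thread): when a camera films a CRT, a rolling band appears at roughly the difference between the monitor's refresh rate and the nearest multiple of the camera's frame rate:

    def beat_hz(refresh_hz, camera_fps):
        # Nearest harmonic of the camera rate; the leftover difference shows up as a rolling band.
        k = round(refresh_hz / camera_fps)
        return abs(refresh_hz - k * camera_fps)

    print(beat_hz(60, 29.97))   # ~0.06 Hz: a very slow rolling bar
    print(beat_hz(75, 29.97))   # ~15 Hz: visible banding/flicker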
Quote (DrunkenBozo @ Aug. 12 2002, 11:42): "On the subject of fps, yes, it is impossible to really see any difference above about 32 fps. This is one of the reasons that TV is at 30 fps. More simply aren't needed for that application. However, it's true that having extra fps as a lag spike buffer is helpful."
No. TV can run at really any speed; its apparent smoothness is virtually doubled by motion blurring, because of the way it is filmed (from real life...).
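For reference, the standard numbers behind broadcast TV (general NTSC facts, not something stated in the thread): roughly 29.97 interlaced frames per second, delivered as about 59.94 fields per second, on top of whatever motion blur the camera's exposure adds:

    frame_rate = 30000 / 1001        # NTSC: ~29.97 frames per second
    field_rate = 2 * frame_rate      # interlaced: ~59.94 fields per second
    print(f"frame every {1000 / frame_rate:.1f} ms, field every {1000 / field_rate:.1f} ms")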
Quote (Cyanide @ Aug. 12 2002, 14:03): "Ugh, my broadband and IE managed to mess up at the same time, sorry for the double/triple posts. On a lighter note, I'm not rich, and neither are most of us gamers; however, 20 dollars for a new video card is NOT that much money."
My problem isn't my video card; it's my slow-as-poo processor. I've looked into this: I would need a new motherboard, power supply, RAM, even a new tower, plus the actual processor. Sure, my video card sucks, but my processor is truly pants.
Quote (Sinister @ Sep. 02 2002, 01:52): "dragonsblade, please stop digging up ancient threads! edit: and wolf? ;)"
Well, I saw this thread on my bi-weekly posting spree, and I thought I would comment on the billions of different "your eye sees things at *insert FPS here*" claims.
I've seen people say it is 12, 20, 32, 34, 45, 50, and once even 100.
Quote (greyfox555 @ Aug. 12 2002, 01:58): "You're wrong, Hida, the human brain refreshes at 24 frames a second. How do I know this? A while back a bunch of geeks got together in a major corporation lab to figure out the problem of expensive movie film... they found out that the human brain refreshes at 24 fps, and switched the film to that fps, so film would become MUCH cheaper. One of those geeks was my father in RCA technical labs :) If the frames per second hits 24 on your screen, or a TV, or a movie screen, you see flickering lines, because that 24 fps interferes with the 24 fps that your brain is running at. That's why they dim the lights in the movie theaters."
I'm sorry, but you're really wrong. All those old studies are just buggered. If you do a course in the human senses (which is now a part of Computer Science: Games Development) you will find that the eye and brain don't 'refresh' at any rate. Tests have proved that people can recognise and subliminally read single frames shown in between blanks at over 200 fps. The eye just constantly receives photons (and it is sensitive enough to register a single photon in a black room), and this stimulation is constantly fed into the brain. Your brain doesn't run on a fetch-execute cycle; it's not a linear program. We have a persistence of vision of around 15 fps, which means anything shown ~15 times per second will be recognised as a moving object. Of course this depends on the person, but 15 frames is usually sufficient for your brain to be fooled into thinking that things are actually moving. But 70 fps fools the brain bettererer! :D
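To put rough durations on the rates in that post (my arithmetic, not the poster's): a frame shown 15 times a second sits on screen for about 67 ms, a 70 fps frame for about 14 ms, and a single frame flashed at 200 fps lasts only 5 ms:

    # How long a single frame stays on screen at each of the rates mentioned above.
    for fps in (15, 70, 200):
        print(f"{fps:3d} fps -> each frame lasts {1000 / fps:.1f} ms")   # 66.7, 14.3, 5.0 ms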
Thank you for just admitting you're an exploited h2o :)
I'm prolly beating a dead horse here, but when I went from a 400 MHz Comcrap to a 900 MHz home-built, my Tribes 2 FPS went from 10 to 20.
That was a huge difference!
Then, when I got my new GF3, my FPS went from 20 to 50...
Oh my god. It was awesome.
Now, after I got my new processor, 1.67 GHz of gaming glory, I get upwards of 100 fps.
There is a <b>big</b> damn difference between 50 and 100 fps.
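One way to see why each of those jumps still feels different (my arithmetic, not the poster's): the frame time shrinks by a smaller absolute amount at every step, but even 50 to 100 fps still halves it:

    # Frame time in ms at each step of those upgrades, and the per-frame time saved.
    steps = [10, 20, 50, 100]
    for before, after in zip(steps, steps[1:]):
        saved = 1000 / before - 1000 / after
        print(f"{before:3d} -> {after:3d} fps: {1000 / before:.0f} ms -> {1000 / after:.0f} ms per frame "
              f"(saves {saved:.0f} ms)")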