Is This Cheating?
Kamex
Join Date: 2004-06-06 Member: 29156
For a long time, I have been playing NS, along with other Half-Life mods, with this userconfig.cfg file:
////////////////////
// Video Settings //
////////////////////
cl_showfps 1 // draw the framerate counter
gl_d3dflip 0 // buffer-flip behaviour in Direct3D mode (as I understand it)
gl_polyoffset -0.001 // depth offset for decals, cuts down z-fighting
gl_round_down 0 // don't round texture sizes down (keeps texture quality)
gl_texsort 1 // sort surfaces by texture when rendering
gl_texturemode GL_LINEAR // texture filtering (GL_LINEAR = bilinear, no mipmaps)
gl_wateramp 1 // amplitude of the water-surface waves
fps_max 500 // framerate cap
////////////////////
// Audio Settings //
////////////////////
bgmvolume 1 // CD / background music volume
hisound 1 // high-quality sound mixing
///////////////////////
// Other Preferences //
///////////////////////
r_novis 1 // ignore precomputed visibility, draw all map leaves
r_wateralpha 1 // water opacity (1 = fully opaque)
I recently read over the FAQ, and it tells me that any command that requires the console is considered an exploit and serious cheating. I want to confirm this. If this is indeed cheating, then which parts are cheating (some of it? all of it?) and why? I am not the hacking type, and I do not want to be a cheater.
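(For reference: this file sits next to config.cfg in the mod's folder. I believe Half-Life execs userconfig.cfg automatically, but if your install doesn't, you can run it yourself from autoexec.cfg:)
// in autoexec.cfg
exec userconfig.cfg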
Comments
Natural Selection is pretty "variable".
Not only can skill levels be a LOT different, but the players themselves can alter the way they play by replacing the default models/skins/sprites with customized ones, like smaller or more precise crosshairs, labeled minimaps, or eye-catching skins (very bright colors).
Though most of this is because of Half-Life itself; you can change it sooo much...
Half-Life is still one of my favorite games to play, with all the mods available :)
QUOTE: "Natural Selection is pretty 'variable'... the players themselves can alter the way they play by replacing the default models/skins/sprites with customized ones."
I don't think it is possible to give everyone in a game an identical playing experience. There are too many variables that are impossible to control. I have an ATI RADEON 9800 PRO. With this card, I can get an almost steady 99 fps at 1600 x 1200 resolution. This gives me a huge advantage over someone at 320 x 240 who can't get more than 20 fps. Even if you could somehow give everyone the same system specs, monitor, and speakers, there are still internet connection issues. I have RR, but for some reason my ping sometimes gets high enough that people start teleporting around. In this state I get owned by someone with 10 ping, and locking settings will not change that.
What I am concerned about is that the creators of NS are saying that any command that has to be entered through the console is considered cheating, and I'm wondering if they seriously mean everything, including what is in this config file.
The average human eye can't differentiate frame rates above 75 or so anyhow... ^_o (no, I have no proof of that, just something I've always been led to believe)
Actually it depends on the media. TV and movies are shown at about 30+ fps and I don't think anyone would say that they are choppy. If you have the same 30+ fps for HL, the difference is pretty obvious.
me23:
If I am not wrong, the HL engine caps the FPS at 100, at which point most monitors, with their 75 Hz refresh rates, fail to keep up anyway.
QUOTE: "Actually it depends on the media. TV and movies are shown at about 30+ fps and I don't think anyone would say that they are choppy."
Actually, when I have been playing games on my PC at about 100 fps and then go watch some TV, I always notice that the TV screen flashes a bit... after watching it for a while, I get used to it.
Consider the intent of that rule rather than the letter of it. There are not enough in-game options for everyone to even be able to enjoy the game without changing some config files. As long as you are not giving yourself an artificial advantage over other players, it's fine, especially if you're just trying to make up for cruddy hardware.
I wish I could remember the name of the university now, but some respected joint in the US ran research to test the human eye. The research was conducted years and years ago, but was found to be very accurate in later testing. Their findings indicated that the human eye can detect the equivalent of 24 frames per second; consequently, most television sets at the time came out producing images at exactly that rate. I don't know if TVs still output at that rate or not. If you look at a TV, monitor, or other visual source that uses a CRT and outputs a frame rate significantly higher than that, it will appear to flash, or you'll see descending black lines move slowly down the screen. That's why, when you watch the news or some other TV program that shows active computer monitors, you can see that flashing and those black lines: the monitor is outputting faster than the TV station is broadcasting.
At any rate, this research suggests that setting your frame rate above 24 fps is theoretically disadvantaging you, because things will happen faster than you can see. Of course, in practice this disadvantage doesn't really exist, because it's so minute it can be discounted. However, the theory remains... there's no point in exceeding 24 fps, because you can't see it anyway.
You'll just have to buy a red car to compensate for your lack of macho fps rates :)
From my experience, I can generally notice dips under 50 FPS, but higher than that just seems similar to me.
All that stuff looks like it's just to optimise for given hardware. Optimising PC performance could never be considered cheating.
QUOTE: "Setting your frame rate above 24 fps is theoretically disadvantaging you... There's no point in exceeding 24 fps, because you can't see it anyway."
To expand:
*For those familiar with how a CRT works*
At any given frame, the electron beam will be at a different position on screen than in the previous frame. The bits just ahead of the beam will be darkest, and those just behind it lightest. At each capture they will be at a different point, normally a little above or a little below where they were last time. Hence the patches of light and dark move up or down the screen.
You may think this is nonsense, since the differences between light and dark should be noticeable when we view a CRT normally. They should. In reality, things actually get darker more quickly than we think; it's just a trick of our eyes that they seem to stay as bright as they do.
Incidentally, the same effect can be seen with car wheels: if they're spinning at a certain speed, on camera they seem to be slowly spinning backwards.
*For those not familiar with how a CRT works*
<a href='http://www.google.co.uk' target='_blank'>http://www.google.co.uk</a>
(Old silent films run at 15 fps without a blur effect.)
TVs broadcast at 24, HOWEVER, your TV will show them at around 60-ish thanks to a nice feature that blends frames. (For example, pause a digital movie or a VCR; the VCR's fuzziness is partly the nature of tape, but digital does the same thing, which leads us to believe there is a blending effect, smoke and mirrors so to speak.)
(Note: HDTV broadcasts at 60 without a blur, hence the improved quality.)
It's also good to note TVs run at a 320x200 resolution (I may be wrong, it's something around there); HDTV runs at 530x400 or something of that nature; again, I may be wrong on that resolution too.
The average human eye sees at about 64 fps. (Your eye's "framerate" drops with age and excessive TV watching; that cool blending effect will blur your vision. Study done by some UK university about 2 years ago.)
This is the average; now, you have to realize the average is on a bell curve (along with almost all human traits: dexterity, strength, etc.). I believe the lowest goes down to 15 fps and the highest up to 120. (Oddly enough, another study showed that people who play video games an average of 2 hours a day have an increased frame rate.)
All in all, always run your game faster than you can see; this way your eyes don't get tired. A great example is fluorescent lighting. It hurts some people's eyes because their eyesight is around the same fps as a fluorescent bulb (which is ~55 fps), yet they have no problems watching TV for 12 hours a day.
Thought it would be nice to get this information out there :p
Most of the things about the human eye are myths... go google for "FPS eye" and you will find a bunch of stuff like the one above (just took one of the random sites)
QUOTE: "If I am not wrong, the HL engine caps the FPS at 100."
In case no one has said it:
The HL engine, now on Steam, can go higher than 100 fps by turning developer mode on: put "developer 1" (no quotation marks) in the console.
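So, if I have this right, something like the following in the console should do it (the 120 is just an example value):
developer 1 // developer mode on; lifts the 100 fps clamp
fps_max 120 // then set whatever cap you want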
The config file you showed, Kamex, is no hacking at all.
Most TVs use 50 Hz in Europe and 60 Hz in the US, with slightly lower resolution in the US. Some TVs use 100 Hz or 120 Hz to avoid flicker. Of those fields, only 25 or 30 full frames per second are new information; the rest is interlacing (every other line is from the previous frame and every other from the next).
Movies use motion blur; that is the big difference. Even so, at 25 fps it's quite easy to see the choppiness when things move fast. You see 60 Hz flicker very easily, and you "see" much higher framerates in non-motion-blurred movies and games. You can't actually differentiate between one frame and another or anything like that, but your eyes gather information continuously, and it looks more like motion blur when you have a high framerate. If you move your hand across the screen quickly (and don't use a TFT, of course) you will see a bunch of discrete instances of your hand; clearly you need craptacularly high framerates before you stop noticing this, and so you do in games as well.
Can you see more than 24 fps? Correctly motion-blurred frames, no; crisp frames such as in a game, yes. You need insane framerates, beyond the capabilities of current monitors, before rapid movement in a game looks like motion-blurred movement.
QUOTE: "If you look at a TV or monitor or other visual source that uses a CRT and outputs a frame rate significantly higher than that, it will appear to flash, or you'll see descending black lines move slowly down the screen... it's because the monitor is outputting faster than the TV station is broadcasting."
This has nothing at all to do with the capabilities of the eye. A CRT lights each pixel up for only a very small interval; it scans across the screen the way you read a page, lighting up one pixel at a time. When a camera films this, it will be out of sync with the monitor: the monitor might fill, say, 3.1 screens for every frame the camera captures (for example, a 75 Hz monitor filmed by a 24 fps camera paints 75/24 ≈ 3.1 refreshes per captured frame). Part of the screen will have been overdrawn more times than another and will look brighter when viewed on TV, and the line where the difference in brightness occurs will appear to travel. This has nothing to do with the eye and isn't evidence of anything.
Most TVs operate at either 50 or 60 Hz depending on where you live. You can't see that a TV flickers like mad, then, I take it? This effect is much more obvious at lower refresh rates; at 25 Hz, as you suggested, it would probably give anyone looking at it a headache within minutes. The theater avoids this flickering by displaying each frame several times in a row (usually three). You need 80-90 Hz before most people stop noticing the flicker completely. This is what monitors usually use; 60 Hz is known to cause eye strain and headaches for many people, and staring at a 25 Hz flickering screen all day would be medieval torture.
QUOTE: "There's no point in exceeding 24 fps, because you can't see it anyway."
Try playing NS at 24 FPS, I dare you.
No one cares what you do to your config to make the game run better, look prettier, or run smoother.
And yes, looking at a monitor running at 70 Hz or below actually causes me pain :E
QUOTE: "Their findings indicated that the human eye can detect the equivalent of 24 frames per second... There's no point in exceeding 24 fps, because you can't see it anyway."
All the eye does is transmit light to the brain. It doesn't perceive motion or color or anything but light. It does break up the color spectrum before sending it to the brain, but the actual translation of color into something you can "see" occurs in the brain.
That said, it takes 12 frames per second to trick the brain into perceiving motion as opposed to a series of still pictures. While it's true that it only takes 24 fps for the brain to perceive fluid motion, the notion that we don't or can't use more than that is false, because the brain (or the eye, for that matter) does not receive light in a series of "frames" but rather as a constant, smooth stream.
Take film, for example: all movies are shot at 24 frames per second (video shoots at 30), which, by the way, is not an issue of science or perception but of cost. In film, if a camera shot pans across a picket fence, the viewer sees a flicker as a space on screen jumps between white fence and green background in a very unnatural way. That's because the picture isn't there for our brain to fill in the blank and see smooth motion instead of flickering; the frames literally "jump" from white to green. Take the same shot filmed at 64 frames per second, and the viewer sees smooth motion with no flicker... The duration of the shot is the same, but there is more than twice the information for our brain to extrapolate.
Take that example, and add all the variables and issues associated with monitors, resolutions, refresh rates, RAM, and video cards, and I think it's safe to say that games are stuck in a sort of perpetual "picket fence feedback", far more so than film or TV. What am I saying? I'm saying yes, the higher the frame rate the better, and yes, your brain sure as hell does perceive more than 24 frames per second. The more visual information your brain can work from, the better... so get out there and git yerself a new video card :p