Question
Demon_Raiser
Join Date: 2002-11-27 · Member: 10122 · Members
Changing the Coloring

Is it possible to change the color setting of NS from 16-bit to 32-bit? I'm wondering if it would look better on my computer set to 32-bit color, since that's what my display properties are always set to. Yes, I have a sucky video card, but I want to try changing NS's coloring anyway. Any ideas I could try instead of reinstalling NS?
Comments
16-bit = for most computers
32-bit = computers that are fast
^
I want 32-bit for NS now, but I don't know how to change it without reinstalling NS.
Some people just like overkill. Just like the guy who won't stop tweaking until he tops out at 100 fps, even though anything beyond 60 is unnoticeable to the human eye.
QUOTE: Some people just like overkill. Just like the guy who won't stop tweaking until he tops out at 100 fps, even though anything beyond 60 is unnoticeable to the human eye.
Actually, above 24 fps is unnoticeable to the human eye, but FPS is different for gaming. I notice a difference between 75 and 100. I always play at 100.
And setting your crappy video card to desktop default would force the game into 32-bit color.
As far as your question goes, I have no idea.
-32bpp
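(If that's meant as the Half-Life launch switch, which is my assumption here since the post only gives the bare flag, it would go at the end of the game shortcut's Target line, something like:

```
hl.exe -game ns -32bpp
```

I believe there's a matching -16bpp switch if you ever want to go back.)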
Not that the human eye can see this BUT
There is more than a "little" difference between 16 bit and 32 bit.
16 bit color is 65,536 colors and 32 bit is 16.7 million.
24 bit is reportedly the highest a human eye can see. Dunno if that's correct however.
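A quick worked example of where those two figures come from (just an illustration in C):

```c
#include <stdio.h>

int main(void)
{
    /* 16-bit colour: 2^16 possible pixel values */
    unsigned long colours16 = 1UL << 16;   /* 65,536 */
    /* "32-bit" colour really uses 24 bits for colour: 2^24 values */
    unsigned long colours24 = 1UL << 24;   /* 16,777,216 */

    printf("16-bit: %lu colours\n", colours16);
    printf("24-bit: %lu colours\n", colours24);
    return 0;
}
```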
I heard that new graphics cards will increase the number of colors, maybe to 16 bits per color component instead of the 8 bits that 24-bit color uses.
Your eyes process and transmit information in one picosecond (0.000000000001 seconds, one trillionth of a second), meaning you can see over a trillion frames per second. Read this for the information: http://amo.net/NT/02-21-01FPS.html. Judging from the work and research put into that, I believe it.
With a GeForce 4, I can easily tell the difference between 120 FPS and 80 FPS.
From another site:
QUOTE: After all, if TV looks smooth at 30fps, then that must be a hard-coded, physiological limit, doesn't it?
QUOTE: Without getting too vague, I've always thought that most games are playable on any "reasonable" resolution, reasonable being 25-35 fps. Back when you guys were playing with your Pentium Pro 200s, I was running Quake on a Pentium 166, and this was in software, too. In fact, I remember the hot (and only) 3D card to get at the time was the Verite V1000, and while the framerates for that card were significantly higher than what we could get in software, the filtered look was a little too much for me to bear. Obviously, higher framerates will look smoother on screen, but what we're talking about here is a "reasonable" framerate.
EDIT: Fixing my data, got it wrong :P
What on earth does that have to do with color depth?
QUOTE: Not that the human eye can see this BUT
There is more than a "little" difference between 16 bit and 32 bit.
16 bit color is 65,536 colors and 32 bit is 16.7 million.
The 'human eye' can see far more than 256 shades of red, green, or blue.
16.7 million is the number of combinations of colour the screen can show, which is made up of 0-255 'points' of Red, Green, and Blue.
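To put rough numbers on that, here's a little worked sketch in C (assuming the common 5-6-5 layout for 16-bit mode):

```c
#include <stdio.h>

int main(void)
{
    /* Common 16-bit layout: 5 bits red, 6 bits green, 5 bits blue */
    printf("16-bit: %d reds x %d greens x %d blues = %d colours\n",
           1 << 5, 1 << 6, 1 << 5, (1 << 5) * (1 << 6) * (1 << 5));
    /* 24-bit layout: 8 bits (0-255) per channel */
    printf("24-bit: %d x %d x %d = %d colours\n",
           1 << 8, 1 << 8, 1 << 8, (1 << 8) * (1 << 8) * (1 << 8));
    return 0;
}
```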
Also, your TV is actually 60fps (fields per second), or 50 for you PAL folks outside the USA. It uses a trick called interlacing, where only half of the scanlines are updated on each refresh, alternating between the two halves, which are woven together in thin lines.
This is why, when you watch most cable TV on an HDTV, especially a flat-shaded cartoon (the newer Batman or Batman Beyond are good examples), you'll occasionally see an effect like the picture projected across a comb: lots of thin horizontal lines. :-)
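For anyone curious, that weaving is easy to picture in code; a minimal sketch (the 720x480 frame size is just an NTSC-ish number I picked):

```c
#define WIDTH  720
#define HEIGHT 480

/* Weave two interlaced fields into one full frame: 'even' holds the
   even scanlines (0, 2, 4, ...), 'odd' holds the odd ones. */
void weave(const unsigned char even[HEIGHT / 2][WIDTH],
           const unsigned char odd[HEIGHT / 2][WIDTH],
           unsigned char frame[HEIGHT][WIDTH])
{
    for (int y = 0; y < HEIGHT; y++) {
        const unsigned char *line = (y % 2 == 0) ? even[y / 2] : odd[y / 2];
        for (int x = 0; x < WIDTH; x++)
            frame[y][x] = line[x];
    }
}
```

Since the two fields are captured a fraction of a second apart on real TV, anything that moves between them shows up as that comb pattern.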
Scientists have found that while the human eye can detect 'differences' at very high speed and down to tiny fractions of a shade, in reality, with a complex image, more than roughly 1024 shades at 100fps would be overkill for well over 90% of the population.
Surprisingly, just about every major video card manufacturer is coming out with cards that support these 10-bit video modes, more commonly advertised as 30-bit. The Radeon 9700 and the Matrox Parhelia are both capable of this, in fact, with noticeable improvements in most video games.
The Radeon 9700, in fact, shows an even more noticeable improvement because it can calculate graphics to far greater precision than it can display, allowing smoother blending and fewer rounding errors, so it can actually display the full 1024 shades of all three colours in most games.
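Those 30-bit modes usually show up as a '10-10-10-2' pixel format: 10 bits per colour channel plus 2 spare bits, packed into the same 32-bit word. A sketch of that packing (the exact channel order differs between cards and APIs, so take this layout as an assumption):

```c
/* Pack three 10-bit channels (0-1023) and a 2-bit alpha into 32 bits. */
unsigned int pack_rgb10a2(unsigned int r, unsigned int g,
                          unsigned int b, unsigned int a)
{
    return ((a & 0x3u)   << 30) |
           ((b & 0x3FFu) << 20) |
           ((g & 0x3FFu) << 10) |
            (r & 0x3FFu);
}
```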
Little-known fact: assume each layer of transparency you see takes away one bit from your colour depth; i.e. if you see more than 3 'transparent' sprites overlapping, your effective colour depth is down to 15/16-bit, not 24-bit. Remember, when I say you lose a bit, I mean for red, green, AND blue, so three bits are actually lost.
If you're running at 15/16-bit natively... this is why things look so washed out and pixelated whenever someone throws a spore cloud, and why webbing is actually harder to see at 24/32bpp than at 15 or 16bpp. (In reality, 32bpp is only 24bpp of colour, with the extra bits left over to keep the memory laid out in a way the computer can access more efficiently, and therefore faster.)
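That last point, 32bpp really being 24bpp plus a spare byte, looks something like this; a sketch assuming the common X8R8G8B8 ordering:

```c
/* "32-bit" pixel: 8 bits each of red, green, blue, plus one unused
   padding byte so every pixel starts on a 4-byte boundary, which the
   hardware can fetch faster than tightly packed 3-byte pixels. */
unsigned int pack_xrgb8888(unsigned char r, unsigned char g, unsigned char b)
{
    return ((unsigned int)r << 16) |
           ((unsigned int)g << 8)  |
            (unsigned int)b;       /* top byte stays zero (the "X") */
}
```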