Question

Demon_Raiser Join Date: 2002-11-27 Member: 10122, Members
Changing the Coloring
Is it possible to change the color settings of NS from 16-bit to 32-bit? I'm really wondering if it would look better on my computer set to 32-bit color, since that's what I always use in my display properties. Yes, I do have a sucky video card, but I think I wanna try changing the coloring of NS. Any ideas I could try instead of reinstalling NS?

Comments

  • GreyPaws Join Date: 2002-11-15 Member: 8659, Members
    Set yer video card to "default to desktop" settings... should do it if yer desktop is at 32?
  • Demon_Raiser Join Date: 2002-11-27 Member: 10122, Members
    I'm talking about when you select what color settings you want for NS, like this:
    16-bit = for most computers
    32-bit = computers that are fast

    ^
    I want 32-bit for NS now, but I don't know how to change it without reinstalling NS.
  • CForrester P0rk(h0p Join Date: 2002-10-05 Member: 1439, Members, Constellation
    Does it honestly matter? There's little difference between 32 and 16 bit color, besides the tone of maybe 3 colors.
  • SemperFi Join Date: 2002-08-02 Member: 1049, Members
    QUOTE (CForrester @ Dec 17 2002, 10:07 PM): Does it honestly matter? There's little difference between 32 and 16 bit color, besides the tone of maybe 3 colors.
    Some people just like overkill. Just like the guy who won't stop tweaking until he tops out at 100 fps, even though anything beyond 60 is unnoticeable to the human eye.
  • GreyPaws Join Date: 2002-11-15 Member: 8659, Members
    QUOTE (|SemperFi| @ Dec 17 2002, 05:13 PM): Some people just like overkill. Just like the guy who won't stop tweaking until he tops out at 100 fps, even though anything beyond 60 is unnoticeable to the human eye.
    Actually, above 24 is unnoticeable to the human eye, but FPS is different for gaming. I notice a difference between 75 and 100. I always play at 100.

    And setting yer crappy video card to desktop default would force the game into 32-bit color...
  • Tediak Join Date: 2002-11-01 Member: 2910, Members
    Actually there's a huge difference between 16 and 32 bit. I'd rather run at 20-30 FPS with 32 than 80 with 16.
    As far as your question goes, I have no idea.
  • Ahnteis teh Bob Join Date: 2002-10-02 Member: 1405, Members, NS1 Playtester, Constellation
    Add to your shortcut
    -32bpp
  • MonsieurEvil Join Date: 2002-01-22 Member: 4, Members, Retired Developer, NS1 Playtester, Contributor
    Yessir. Add the switch '-32bpp' (no quotes) to your 'Launch Natural Selection' shortcut on the start menu. It will definitely add a lot of overhead to your video card, but if it can handle it, things will look considerably better, especially when it comes to particle effects.
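    For anyone unsure where that switch goes: right-click the 'Launch Natural Selection' shortcut, open Properties, and add -32bpp to the end of the Target line. It should end up looking roughly like this (the install path below is only a typical default, not taken from this thread; yours may differ):

        "C:\Sierra\Half-Life\hl.exe" -game ns -32bpp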
  • Heresy_Fnord Join Date: 2002-11-05 Member: 7207, Members
    QUOTE (CForrester @ Dec 17 2002, 05:07 PM): Does it honestly matter? There's little difference between 32 and 16 bit color, besides the tone of maybe 3 colors.
    Not that the human eye can see this BUT

    There is more than a "little" difference between 16 bit and 32 bit.

    16 bit color is 65,536 colors and 32 bit is 16.7 million.
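    A quick check on where those figures come from (just powers of two, nothing NS-specific; the exact per-channel split is the usual convention, not something stated above):

        16-bit:   2^16 = 65,536 combinations (typically 5 bits red, 6 green, 5 blue)
        24-bit:   2^24 = 16,777,216, i.e. roughly 16.7 million (8 bits per channel)
        32-bit:   the same 24 bits of colour, plus 8 bits of padding or alpha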
  • Ahnteis teh Bob Join Date: 2002-10-02 Member: 1405, Members, NS1 Playtester, Constellation
    you'll get banding in color gradients if you use 16 bit.

    24 bit is reportedly the highest a human eye can see. Dunno if that's correct however.
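    A minimal sketch of that banding effect, in plain Python (nothing NS-specific, just an illustration of the quantization): squeezing a smooth 8-bit grey ramp down to 5 bits per channel, as the common 15/16-bit modes do, leaves only 32 distinct shades instead of 256, and the jumps between them show up as visible bands.

        def quantize(value, bits):
            """Reduce an 8-bit channel value (0-255) to 'bits' of precision, then scale back up."""
            levels = (1 << bits) - 1
            return round(value / 255 * levels) * 255 // levels

        gradient = list(range(256))                      # a smooth 8-bit grey ramp, 256 shades
        ramp_5bit = [quantize(v, 5) for v in gradient]   # what a 5-bit channel can reproduce

        print(len(set(gradient)), "distinct shades at 8 bits per channel")   # 256
        print(len(set(ramp_5bit)), "distinct shades at 5 bits per channel")  # 32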
  • ViPr Resident naysayer Join Date: 2002-10-17 Member: 1515, Members
    I can sometimes count the distinct colors in 24-bit color mode, and I'm not talking about the integer rounding problems in most graphics cards; I can see it even in raytraced images where everything is calculated with high-precision floating-point numbers and only the final color is converted to 24-bit.

    I heard that new graphics cards will increase the number of colors to maybe 16 bits per color component, instead of the 8 bits per component that 24-bit color uses.
  • Ahnteis teh Bob Join Date: 2002-10-02 Member: 1405, Members, NS1 Playtester, Constellation
    I believe that'll just be for the intermediate steps, not the final result. They'll probably also devote a lot of those bits to transparency.
  • uranium_235 Join Date: 2002-11-20 Member: 9478, Banned
    edited December 2002
    QUOTE (Ahnteis @ Dec 17 2002, 06:45 PM): 24 bit is reportedly the highest a human eye can see. Dunno if that's correct however.
    Your eyes process and transmit information in one picosecond (0.000000000001 seconds, one trillionth of a second), meaning you can see over a trillion frames per second. Read this for the information: http://amo.net/NT/02-21-01FPS.html. Judging from the work and research put into that, I believe it.

    With a GeForce 4, I can easily tell the difference between 120 FPS and 80 FPS.

    From another site:
    QUOTE: After all, if TV looks smooth at 30fps, then that must be a hard-coded, physiological limit, doesn't it?

    QUOTE: Without getting too vague, I've always thought that most games are playable on any "reasonable" resolution, reasonable being 25-35 fps. Back when you guys were playing with your Pentium Pro 200s, I was running Quake on a Pentium 166, and this was in software, too. In fact, I remember the hot (and only) 3D card to get at the time was the Verite V1000, and while the framerates for that card were significantly higher than what we could get in software, the filtered look was a little too much for me to bear. Obviously, higher framerates will look smoother on screen, but what we're talking about here is a "reasonable" framerate.

    EDIT: Fixing my data, got it wrong. :P
  • Ahnteis teh Bob Join Date: 2002-10-02 Member: 1405, Members, NS1 Playtester, Constellation
    QUOTE (uranium_235 @ Dec 18 2002, 02:11 AM): Your eyes process and transmit information in one picosecond..., meaning you can see over a trillion frames per second.
    What on earth does that have to do with color depth?
  • uranium_235 Join Date: 2002-11-20 Member: 9478, Banned
    edited December 2002
    Confucius say: Man who read whole post not look like idiot when replying.
  • WolfWings NS_Nancy Resurrectionist Join Date: 2002-11-02 Member: 4416, Members
    QUOTE (Heresy_Fnord @ Dec 17 2002, 03:25 PM): Not that the human eye can see this BUT there is more than a "little" difference between 16 bit and 32 bit. 16 bit color is 65,536 colors and 32 bit is 16.7 million.
    The 'human eye' can see far more than 256 shades of red, green, or blue.

    16.7 million is the number of combinations of colour the screen can show, which is made up of 0-255 'points' of Red, Green, and Blue.

    Also, your TV is actually 60 fields per second, or 50 for you PAL folks outside of the USA. It uses a trick called interlacing, where only half of the scanlines are updated on each pass, woven together in thin lines.

    This is why, when you watch most cable TV on an HDTV, a flat-shaded cartoon (the newer Batman or Batman Beyond are good examples) will occasionally show an effect like the picture being projected across a comb: lots of thin horizontal lines. :-)

    Scientists have found that while the human eye can detect 'differences' at a very high speed and to millionths of shades, in reality, with a complex image, more than roughly 1024 shades at 100fps would be overkill for well over 90% of the population.

    Surprisingly, almost every major video card manufacturer is coming out with video cards that support these 10-bit video modes, more commonly advertised as 30-bit. The Radeon 9700 and Matrox Parhelia are both capable of this, in fact, with noticeable improvements in most video games.

    The Radeon 9700, in fact, shows an even more noticeable improvement because it's able to calculate graphics at a far greater precision than it can display, allowing for smoother blending and fewer rounding errors, which lets it actually display the full 1024 shades of all three colours in most games.

    Little-known fact: assume each layer of transparency you see takes away one bit from your colour depth. I.e., if you see more than 3 'transparent' sprites overlapping, your effective color depth is down to 15/16-bit, not 24-bit. Remember, when I say lose a bit, I mean for red, green, AND blue; three bits are actually lost.

    If you're running at 15/16-bit natively... this is why things look so washed-out and pixelated whenever someone throws a spore cloud, and why webbing is actually harder to see at 24/32bpp than at 15 or 16bpp. (In reality, 32bpp is only 24bpp of colour, with extra bits left over as padding to keep the memory laid out in a way the computer can access more efficiently, and therefore faster.)
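    To make that last point concrete, a small Python sketch (the channel order and which byte is left unused vary by pixel format, so the layout here is just illustrative): a 24bpp pixel is 3 bytes of colour, while a "32bpp" pixel is the same 3 bytes plus one byte of padding or alpha, so every pixel starts on a 4-byte boundary the hardware can fetch in a single aligned read.

        import struct

        r, g, b = 200, 120, 40
        pixel_24bpp = struct.pack("BBB", r, g, b)      # 3 bytes: colour only
        pixel_32bpp = struct.pack("BBBB", r, g, b, 0)  # 4 bytes: colour plus an unused/alpha byte

        print(len(pixel_24bpp), "bytes per 24bpp pixel")   # 3
        print(len(pixel_32bpp), "bytes per 32bpp pixel")   # 4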