Upgraded My Computer, Same Fps
Uh-Oh
Join Date: 2002-11-04 · Member: 6917
in Tech Support
Upgraded to a way better system, no gain

Ok, so I was tired of getting 10-99 FPS (with an average of 30, and dropping to 15 in fights and at base).
So, I upgraded my system to a better one.
<b>Old configuration:</b>
Duron 800
Gigabyte GA7ZX-1
384MB PC-133
Voodoo3 3000 PCI 16MB
WindowsXP Pro
All the latest VIA drivers, and the best Voodoo3 drivers (AmigaMerlin).
I played in 1024x768 16bit.
<b>New configuration</b>
AthlonXP 1900+
ECS K7S5A revision 3.1
Hercules Prophet II GeForce2 Ti 64MB DDR
384MB PC-133
WindowsXP Pro
30.82 detonator drivers
I play in 1280x960 (I have a 19" screen) 16bit
I get the same FPS counts...
HELP!!!
Thanks
Comments
[QUOTE]GeForce2 Ti 64MB DDR[/QUOTE]
There's your problem.
[QUOTE]30.82 detonator drivers[/QUOTE]
I think this is your problem. Try updating to the newer drivers; I think it's something like 40.42. I don't know for sure because I have a Voodoo3 3000 16MB AGP.
If you look at what you wrote, your experiment has a flaw: your resolution is higher on the newer computer, and that is going to cut your FPS down.
Try running with the *SAME* settings as your old computer and you'll notice you get a higher framerate.
People don't realize that changing your settings, even if your computer can support it, will still change the framerate.
Pic: crowded base in a 16-player server.
I'm constantly at 40+ FPS on:
AMD Duron (900MHz)
256MB SDR RAM
GeForce4 MX 440
18GB hard drive with 8GB free
OpenGL, 800x600
Next on my list:
<u>If you don't know what the hell you're talking about, then don't bother posting.</u>
[QUOTE]
GeForce2 Ti 64MB DDR
There's your problem.
[/QUOTE]
It so isn't the problem. I RAN THE GAME AT THE SAME FPS WITH A VOODOO3 3000 PCI 16MB... I can run Unreal Tournament 2k3 fine with the HIGHEST settings on that GeForce2 Ti 64MB, so don't come bullshitting me about a five-year-old game. Sorry for bursting everyone's bubble, but I'm sick and tired of people saying to upgrade a fine system. $hit! People with a GeForce4 Ti 4600 128MB are having the SAME problem as me!
[QUOTE]heh heh, nice one... also don't forget, no matter how fast your PC is, the server won't run any faster[/QUOTE]
You're simply out of my league... I mean... you should be in those classes where they teach you what a mouse does...
[QUOTE]
30.82 detonator drivers
I think this is your problem. Try updating to the newer drivers; I think it's something like 40.42. I don't know for sure because I have a Voodoo3 3000 16MB AGP.
[/QUOTE]
Newer drivers don't mean BETTER drivers. For GeForce2s you are better off with the 23.11 than with ANY newer driver; nVidia is optimising the newer drivers for the newer cards. Best performance depends on the driver/card model combination. Thanks anyway though ???
[QUOTE]
uhh
If you look at what you wrote, your experiment has a flaw: your resolution is higher on the newer computer, and that is going to cut your FPS down.
Try running with the *SAME* settings as your old computer and you'll notice you get a higher framerate.
People don't realize that changing your settings, even if your computer can support it, will still change the framerate.
[/QUOTE]
And <u>you</u> don't seem to realize how a computer works. First off, there is only a 0.2% (seriously) performance difference if I go down to 1024x768. Why is that? It all has to do with my 64MB of DDR... Learn how a component works before posting.
Soooo, if anyone has any constructive help, I (and many others in the same boat) am more than all ears. There are tons of people with an AthlonXP 2400+ or a P4 2.3GHz, a GeForce4 Ti 4600, 512MB DDR 333MHz, the best drivers and everything else that's implied when you have a kick@ss system, who are still getting 50 FPS...
Now consider this: they get 250 FPS in Quake3A, they get 150 FPS in Unreal Tournament 2k3, and they cap at 99 FPS in DoD, CS, HL, etc... There is a flaw, and we need to find it.
Thanks
I've got 2 PCs.
1:
Pentium 4 1800MHz
GeForce3 Ti200 64MB
Win2k
40.72 Detonator
--> 60-70 FPS
2:
AMD Athlon XP 2000+
GeForce4 Ti4200 64MB
Win2k
40.72 Detonator
--> 99.9 FPS
And I can't see any difference between 60 and 100 FPS.
Here's a list of what HL cares about:
-RAM
-Virtual RAM
-OpenGL-compliant Graphics card
And here's a list that HL doesn't give a crap about:
-CPU
-Chipset
-RAM type
I have a Frankenstein machine running Win98 on a P2-400 with a GeForce3, and I get upwards of 200 FPS in NS at 1024x768. Want to know my secret? I nabbed a 2GB hard drive from the school storeroom and made it my virtual memory drive, and nabbed a bunch of RAM, for a total of 384MB, the most the motherboard supports.
By the way, I'm A+ certified, which means I know what I'm talking about as far as computer hardware is concerned. Just so you can save yourself the energy of a wasted flame, you know? :p
-RAM: <b>384MB PC-133</b>
-Virtual RAM: <b>1 gig of it</b>
-OpenGL-compliant Graphics card: <b>GeForce2 Ti 64MB</b>
The only reason I'm unhappy is that I upgraded (SPENT MY MONEY) and there is practically no gain in performance whether I use my old Voodoo3 3000 PCI 16MB or my GeForce2 Ti 64MB...
I mean, it's frustrating... especially seeing how people with $hitty computers get better results...
[QUOTE]I can run Unreal Tournament 2k3 fine with the HIGHEST settings on that GeForce2 Ti 64MB.[/QUOTE]
I find that to be TOTAL bs. There is no way in hell you can run 1600x1200 at 32-bit with every detail on highest and have it playable on a piece of **obscenity** GeForce2. My GeForce3 Ti 500 can't handle that, so I know that's bs...
I.E.:
fps_max
fps_modem
Mine are set to 100.
One other thing to note: NS servers have a max FPS of 99. Also, when things get thick in the server, like lots of players and lots of entities, the server's FPS drops and you will not get more than that.
My server is an AMD 1700+ XP with 512MB of RAM, and the FPS the server shows drops to 25-60 when the game is well along, floating back and forth in that range.
My playing comp is a:
P4 2.24a OC'd to 2.6
GeForce4 Ti 4200 Gold, OC'd 50%
WinMe
I get 99 FPS solid @ 1600x1200.
I run all card settings, except the OC, at default.
Oh, also check the mini-drivers for your mobo's AGP slot. I had an AMD 500 that had the wrong mini-drivers for the AGP; after I loaded the right ones, my FPS on that comp doubled. I might be misstating the name of the drivers, so look at your mobo's instructions; AMD has 4-in-1 drivers, I think, not sure.
Hope this helps you.
And I can run the game fine (1024x768, 32-bit, all details on high, at a constant 85 FPS, which is both my refresh rate and my fps_max setting) with the following specs:
Athlon XP 2000+
Asus A7V333 (v1013 BIOS, VIA 4-in-1 v4.43)
256MB Corsair PC2700 DDR
GeForce2 MX 400 64MB (Detonator 29.42)
WD 40GB 7200RPM HD
Windows XP w/ SP1
Based on that, I'd say it's a driver problem. Try an older Detonator (get them from http://www.guru3d.com) and update your motherboard drivers and BIOS to the latest versions.
[QUOTE]
I.E.:
fps_max
fps_modem
Mine are set to 100.
[/QUOTE]
If you're looking for better FPS numbers, then this could be what you're looking for. Do your Half-Life games seem capped at 99? Is the 99 constant? Jack up your fps_max (I've never figured out what fps_modem does) and see if your performance follows.
<i>Sidenote:</i> Of course, you realize the game won't look any different at 3500 FPS than it did at 99.
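If you want to try it, something like this in the console (or in your autoexec.cfg) is a rough sketch; the 100 is just an example value, not a recommendation:

// raise the client frame cap
fps_max 100
// draw the engine's own FPS counter so you can see whether the cap actually moved
cl_showfps 1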
Oh yeah, I remember a command to get a different FPS reading in CS. It renders your current settings and rotates your view 360 degrees as fast as it can without dropping frames, then calculates the FPS from the time the "spin" took. Anyone remember that command?
I think it was something like "refresh" or "refreshtimerate". Something like that...
You say you have a Voodoo3 3000 PCI 16MB running NS at 40 FPS. How? I have a Voodoo3 3000 AGP 16MB (overclocked to 200MHz instead of 166MHz) and I only get ~25 FPS, and ~4 FPS in long games/ns_caged. I think it's because I have little virtual RAM.
Specs:
400MHz AMD K6 with 3D accel.
328MB PC1200 (?) RAM
Voodoo3 3000 AGP 16MB (200MHz clock speed instead of the default 166MHz; any more and my computer goes wacko)
My 1GB system drive with ~200MB free space.
My 3GB Half-Life drive with ~2GB free space.
My 20GB "other" drive with ~10GB free space.
Any tips on how to raise my FPS without spending money?
BTW, sorry for "hi-jacking" the topic. :D
Take the Voodoo 3000 and stick it back in. Get the WickedGL drivers, and you should get ~30-40 FPS everywhere.
BTW, these people are trying to help you; don't be an a$$. :p
I limit mine to about 50 FPS and run it with absolutely everything on full at 800x600 on a GeForce4 Ti4200 64MB DDR. (I don't go higher than 800x600 or my monitor starts to scramble my brain. It is, after all, probably 14 years old.)
Timerefresh isn't a particularly good way of representing your FPS, as everyone will probably do it in a different location; I have got results between 100 and 400 FPS with timerefresh depending on where I stand. If you're going to timerefresh and compare, do it in the readyroom on a LAN game and do it again in the same place for each settings test/system. Fluctuating framerates can give people misleading information.
Regardless of timerefresh, though, the displayed FPS will never go beyond 99; it doesn't need to anyway.
3500 FPS? In Natural Selection? In this century? An absolute top-of-the-range system can expect 200-500 in timerefresh, but what does it matter? I have played from 7 FPS to 99 FPS and it doesn't make the game any less fun. Long live NS (when I can finally get into all 1.03 servers, that is).
I'm sorry if I was rude earlier on, but I just spent a lot of money (well, it's a lot for me, I'm on a tight budget) and I didn't get any significant boost in performance.
But:
I know that past 72-ish FPS the eye can't even see the added FPS (it varies from person to person). It's the principle: I spent money, I now have a better computer, but I have the same FPS. I practically spent money for the fun of spending money (which isn't fun).
To Unknown: you need a better CPU. I have a similar system (a K6-2 350MHz that I upgraded to a 500MHz) and it runs at about the same speeds as yours does.
I know it's a pain in the royal @ss to be told your hardware isn't suitable anymore.
If you can, though, get the following components:
-A cheap KT133A- or KT266-compatible mobo (about $30-60)
-A cheap AMD Thunderbird 1333MHz (about $40-50, they run very well), a cheap Duron 950-1300MHz ($20 to $50), or an AthlonXP 1700+ to 2100+ ($50 to $80)
-256MB of PC2100 DDR memory (it's 266MHz, costs around $50 to $65)
And in the future you could buy a cheap GeForce3 Ti 200 or an ATI Radeon 8500LE for about $70 to $90.
All prices are in U.S. currency and for retail or OEM equipment (not trade-ins, good deals on the web, refurbished material, etc.; prices are for NEW equipment).
Also, you can download and use the AmigaMerlin drivers (the fastest and most stable drivers I ever found for my Voodoo3 3000).
I'd suggest you go to the www.guru3d.com forums and check out the 3Dfx section.
As for my own problem, I am well aware that playing at 30+ FPS gives a constant, no-slowdown game. But I'd very much prefer playing at 99 FPS, since the eye can very clearly see the added FPS in games at 60 FPS (and sometimes more). I've been away for the past week or so, but now that I'm back, I'm checking my mobo drivers. It's SiS-based (which I have never dealt with) and the BIOS is really not up to date. I tried tweaking it, but it keeps crashing if I set my BIOS to 133/133 for my processor's running speed.
So I have to play with a 1.2GHz processor (instead of 1.6-ish), since I need to put my setting at 100/100.
Once I've flashed my BIOS, I'll see what happens. I'll also look into the SiS drivers, since updating the AGP slot driver alone can be a big help. I just don't know if there are things like the VIA 4-in-1 drivers out there for SiS-based boards.
Anyway, have a nice day, and thanks for the help.
[QUOTE]I think it was something like "refresh" or "refreshtimerate". Something like that...[/QUOTE]
Eureka! The command is "timerefresh". I find that it gives a better reading on FPS in Half-Life.
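For anyone who wants to try it, here's roughly how you'd use it; the key choice is just an example, and on some builds the command may only work in a local/LAN game:

// type in the console while standing in the spot you want to measure
timerefresh
// or bind it to a key so you can re-test the same spot after changing settings
bind "F10" "timerefresh"

It spins the view a full 360 degrees and prints the time and average FPS to the console, so stand in the same place each time if you want comparable numbers.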
[QUOTE]I limit mine to about 50 FPS and run it with absolutely everything on full at 800x600 on a GeForce4 Ti4200 64MB DDR. (I don't go higher than 800x600 or my monitor starts to scramble my brain. It is, after all, probably 14 years old.)[/QUOTE]
Oh, you're killing me: you run a GF4 and don't go above 800x600? I feel like mailing you a new monitor. Also, you'd probably notice a difference between 30 and 60 FPS if your monitor were better; it sounds like a refresh rate problem. Your comp runs the game at 100+ FPS, but your monitor might be refreshing slower than 60 times per second (although I don't know if monitors do that). The difference is all in turning quickly: a constant 60 FPS while spinning fast is quite impressive on a nice monitor, while 30 FPS will chop a little when spinning.
[QUOTE]Timerefresh isn't a particularly good way of representing your FPS, as everyone will probably do it in a different location. I have got results between 100 and 400 FPS with timerefresh depending on where I stand. If you're going to timerefresh and compare, do it in the readyroom on a LAN game and do it again in the same place for each settings test/system. Fluctuating framerates can give people misleading information.[/QUOTE]
Yes, timerefresh does fluctuate a lot, but it's only an exaggeration of changes that happen all the time. I was just trying to suggest a reading that isn't capped like the other HL FPS readings. The original poster wants a way of comparing performance after an upgrade, and I thought timerefresh was better than cl_showfps.
[QUOTE]3500 FPS? In Natural Selection? In this century? An absolute top-of-the-range system can expect 200-500 in timerefresh[/QUOTE]
I really need to find a better way of expressing sarcasm when I post; no one ever gets it... :)
The cones in the human eye, the ones you primarily use to stare at your monitor, can't detect anything faster than about 40-45Hz. That's why your light bulb looks steadily lit even though it's cycling 60 times a second. The rods in your eye are more sensitive and can see flicker up to about 65Hz, but that's your peripheral vision, and I doubt you play NS looking at the corner of your monitor. So 99 FPS really means nothing: 40-45 FPS would be fine, except when you throw your mouse around quickly, in which case 65 FPS would be ideal. After that, your eye doesn't see it anyway.
Anyway, as for solutions to your problem:
- 512MB of RAM would be better; however, 384MB should be fine. Make sure you don't have many programs running.
- The latest nVidia drivers DO improve speed even on older GeForce cards. I have a GeForce2 GTS, and I got better speeds from the latest Detonators.
- Get NVRefresh to ensure your game is running at the highest monitor refresh rate, since nVidia cards default to the lowest for some stupid reason.