Hardware question time
DiscoZombie
Join Date: 2003-08-05 Member: 18951Members
in Off-Topic
<div class="IPBDescription">PSUs and SSDs</div>Whenever someone mentions a hard-to-diagnose computer problem, the first question is almost always whether their PSU is beefy enough. How do you know how much power supply you need? I just ran my specs through the calculator at <a href="http://www.antec.outervision.com/PSUEngine" target="_blank">http://www.antec.outervision.com/PSUEngine</a> - and it said I needed about 242w to run my system, and here I thought my 500w PSU was wimpy by gamer standards... are these calculators accurate? what sort of hardware would you need before you start requiring like an 800w power supply?
also, anyone have any estimates on how long it will be until SSDs are affordable? Anyone run one? How much did you pay for it? I hear very good things - I'm drooling over the idea of near instant bootup and game load times...
Comments
edit: and if you end up with extra you can use the cheap ones for handy dandy around the house stuff... if you're into that
High-quality PSUs not only can safely deliver the power they're rated for, they do so at better efficiency. Therefore, if your system needs 300W and your PSU is 85% efficient, it actually draws 353W from the wall. Bad PSUs can be as low as 60% efficient, which means a 300W load actually demands 500W.
Needless to say, bad efficiency = a more expensive electricity bill, but also a hotter supply, which means a danger of overheating, or worse. Remember that for gaming you need stability, and over lengthy periods of time. Otherwise your system can reboot unexpectedly, or sometimes just shut down. Worst-case scenarios can see your PSU or system destroyed.
One last thing: buying too big a PSU isn't a good idea either. Peak efficiency is usually reached at around half the PSU's rated output. That's a rule of thumb, and besides, most quality supplies now provide good efficiency from 20% load and up.
edit: to clarify, when I say "a 300W load actually uses 353W", that means that if your PSU is rated for 300W, it "should" be fine (though running at 100% load isn't smart). However, 353 watts are drawn from your wall outlet. The rating printed on your power supply tells you what it can deliver; it doesn't tell you what it actually draws from your installation (which depends on both efficiency and actual power consumed). Therefore, it is better to have an 850W supply working at 82% efficiency on a 300W load than a 350W supply working at 60% efficiency on the same load.
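To make the efficiency math concrete, here's a minimal Python sketch of the wall-draw calculation described above (the 300W / 85% / 60% figures are the ones from this post; nothing here is measured data):

```python
def wall_draw_watts(dc_load_w, efficiency):
    """Power pulled from the outlet for a given DC load at a given efficiency."""
    return dc_load_w / efficiency

# A 300W load at 85% efficiency pulls ~353W from the wall;
# at 60% efficiency the same load pulls a full 500W.
print(round(wall_draw_watts(300, 0.85)))  # 353
print(round(wall_draw_watts(300, 0.60)))  # 500
```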
I'll also add that you have to be careful about how this power is split across the different rails. Check your PSU's sticker for additional info. Remember that Power(W) = voltage(V)*current(A), and that being under the total power limit doesn't mean you're under the limit of every single rail.
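A quick sketch of that per-rail check, using P = V*I rearranged to I = P/V (the 216W GPU figure is a made-up example, not from any real spec sheet):

```python
def rail_ok(voltage_v, max_current_a, load_w):
    """True if load_w can be drawn from one rail without exceeding its
    current limit: P = V * I, so I = P / V."""
    return load_w / voltage_v <= max_current_a

# A hypothetical GPU wanting 216W from the 12V rail needs 18A,
# so a 12V/15A rail is not enough, while a 12V/20A rail is fine.
print(rail_ok(12, 15, 216))  # False
print(rail_ok(12, 20, 216))  # True
```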
As for SSDs, I'm hoping to get one soon. They are quite expensive, but hard drives have been the bottleneck in a system for too long. I'm eyeing the G.Skill Falcon: 64 GB for 190€, 128 GB for 320€. I know it's very expensive, but the performance is breathtaking.
this man knows what he is talking about
Also, with video cards these days, look for ratings in the specs that tell you how many 12-volt amps the card wants. Lots of power supplies won't deliver that many amps (cards usually want more than 15A or so on the 12V rail), and at the very least you'll take some performance hits because of it. The PSU will have ratings like 12V, 10A or 12V, 58A. They say that as far as 12V goes, a single 12V rail is better than two. Rail is just a fancy term for "supplied voltage line." I tend to agree, if only because it reduces complexity.
edit: speel bad
That makes sense.
You mean that crap uncle Ohm used to go on and on about?
1 gpu
2 cpu
3 and everything else
with 1 and 2 high-end, you rarely consume more than ~350w, but that would be measured actual consumption.
the components' ratings would sum up to ~450w (including #3).
so when choosing a psu, get a little margin to be on the safe side, so 500-550w.
note this is with 1 quad cpu, 1 gpu, no OC
with 2 high end gpus, add another ~250w to the final psu wattage (300 if radeon 4870x2 :P).
also note all figures are in full load context, not idle.
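the sizing rule of thumb in the list above can be sketched like this (the per-component wattages are illustrative guesses, not measured figures):

```python
# Sum the components' rated draws, then add headroom so the PSU
# never runs near 100% load (per the post: ~450w rated -> 500-550w psu).
components_w = {"gpu": 180, "cpu": 125, "everything else": 145}
rated_total = sum(components_w.values())  # 450
recommended = round(rated_total * 1.2)    # ~20% margin -> 540
print(rated_total, recommended)  # 450 540
```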
and like mentioned, go for the best efficiency, for many reasons...
most important is the quality of the components, and thereby the quality of the output (and psu lifetime). when a manufacturer is making a technologically advanced psu with high efficiency, they simply don't choose crappy components.
second, your psu will be cooler and quieter (less energy wasted into heat).
and ofc, better for the environment, and for technology progression.
you have the "80plus" standard to guide you. there's plain 80+, 80+ bronze, silver, and on a few new ones - gold.
i recommend silver or bronze, simply cos of the availability, gold if you can find one and have money for it.
some links:
<a href="http://www.tomshardware.com/reviews/geforce-radeon-power,2122-6.html" target="_blank">http://www.tomshardware.com/reviews/geforc...wer,2122-6.html</a>
<a href="http://www.80plus.org/" target="_blank">http://www.80plus.org/</a>
Intel said they have reduced costs by 50% going from 50nm to 34nm, so there's a lot of room for price adjustments once the competition steps up.
There will not be a good time to buy an SSD in the coming year if you are extremely cost-conscious like me :D
Intel will always have the leading edge, but AMD always prices accordingly to try and undercut them. Too bad AMD failed at the dual core battle, but they have awesome low-power cores.
The flash memory industry is really one of the fiercest in the electronics world right now. Prices are really dropping amazingly fast, that's one of the reasons Intel is cutting down the prices so dramatically on a simple generation change. And believe it or not, Intel doesn't rule the flash memory market.
Actually, I believe AMD and Intel have cross-tech/licensing agreements. If one group develops something, they get claim to it for a while but then share it with the other guys, typically a few (i.e. not 2) months. Turns out the majority of the chip companies have these with at least one other company.