AMD 6570, name and shame ^^. Decent for everyday computing, not gaming.
N@uralBornNoobist
Gorge-N-Freeman, 2Gorges1Clog · Join Date: 2012-12-24 · Member: 176138 · Members
The least I would recommend to play this game, in all honesty, is a basic GeForce GTX 650 (they average £80). Paired with an adequate (not fast) processor such as a Sandy Bridge i3, you'll get somewhere between 40 and 70 fps at highish settings (but forget about ambient occlusion).
Here's the 2013 lineup of gaming-class GeForce cards; the AMD equivalent of the GTX 650 is the HD 7750.
For a budget like yours I would be looking at the last two generations of cards and what you can get. Here in the USA you can find the following GPUs used for about your price range, depending on luck (per brand, low to high): AMD 5830, 6850, 5850; NVIDIA 460, 465, 470.
If you could swing your budget higher, I might look at an AMD 5870 or 6950, or an NVIDIA 470 or 560 Ti, which are a bit better than the 650 but cheaper since they're used.
Edit: just went to a comparison... why is my multi-year-old, discontinued card rated better than the more current 650? Makes no damn sense to me.
Well, some forumers in places like TPU just want to get rid of their old cards and will offer nice prices, but that isn't for everyone. I never buy used hardware either, and wouldn't unless it's a killer deal on a very neatly treated piece; if you saw how my cards look after three years of my thrashing... haaaa
Your card is equivalent to the GTX 560 non-Ti (336-core version); both are pretty similar in performance, and both are faster than the GTX 650. The 650 performs slightly faster than the GTX 550 Ti, and about 30% faster than a reference GTS 450. Guess I'll share some of my expertise with you guys, never mind the bollocks.
This is a screenshot from a 768MB GTX 460.
Since your card uses the older Fermi architecture, which still had the shader hot clock, there are a few differences; Kepler cards only have the base clock. All in all, one Fermi shader at 1400 MHz (700 MHz core) is roughly equivalent to one and a half to two Kepler shaders at 700 MHz, disregarding the other architectural enhancements.
Kepler cores can clock substantially higher, however: for GK104-class cards, clocks upwards of a GHz are present even on the most basic parts, and for GK110 the Titan has a base clock of 836 MHz, but every card is capable of 1100 MHz+ with very minimal tweaking, making them extremely powerful compared to a Fermi shader (which would need to run at 2200 MHz to match, something only the best GTX 580s, binned and under water with a custom power-phase setup, can do).
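To put rough numbers on that hot-clock difference, here's a back-of-envelope comparison between the GTX 560 (336 Fermi shaders at an 810 MHz core clock, so a 1620 MHz hot clock) and the GTX 650 (384 Kepler shaders at 1058 MHz). These are just the published reference clocks, and the model ignores everything but raw ALU throughput, so treat it as an illustration, not a benchmark:

```python
# Back-of-envelope shader throughput, Fermi vs Kepler (illustrative only).

def fermi_throughput(shaders, core_clock_mhz):
    """Fermi ALUs tick at the hot clock, 2x the core clock."""
    return shaders * core_clock_mhz * 2

def kepler_throughput(shaders, core_clock_mhz):
    """Kepler dropped the hot clock; ALUs tick at the base clock."""
    return shaders * core_clock_mhz

fermi = fermi_throughput(336, 810)    # GTX 560 non-Ti
kepler = kepler_throughput(384, 1058) # GTX 650

print(round(fermi / kepler, 2))  # ~1.34
```

That ~1.34 ratio is one way to see why the older 560 can still rate above the newer 650 in comparisons.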
The GTX 650 uses the entry-level GK107 processor (same as the GT 650M and GTX 660M); it has 384 "cores", a 128-bit memory bus, 16 ROPs and 32 TMUs, which is roughly equivalent to 192 Fermi cores in a similar setup. Here's my GT 650M; it's basically the same card, really. Note how the GTX 460 achieves this at half the clock speed.
The most accurate Kepler comparison to the GTX 460 1 GB and the GTX 560 non-Ti is the GTX 650 Ti BOOST (768 cores, 192-bit memory), though the BOOST will be quite a bit faster because of Kepler's superior texturing capacity. This makes sense, seeing as the GTX 580 is roughly equivalent to the non-Ti GTX 660 with its 960 cores (but with weaker texture throughput).
That all makes the GTX Titan roughly as powerful as three to four GTX 580s, accounting for all the core, instruction and memory-management enhancements; coming from three 480s, I can verify it's sometimes even faster than that. My brother has two GTX 580s in SLI (overclocked to 900 MHz) and it doesn't even come close... my Titan alone has more texture fillrate than that setup combined, with room to spare. It's a $1000 processor for a reason, and worth every cent... but I believe a dude with more than two of these is frankly insane haha.
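For anyone wondering where fillrate figures come from: theoretical fillrate is just unit count times clock. A quick sketch using the Titan's specs as given above (836 MHz base clock, with the commonly quoted 48 ROPs and 224 TMUs; real-world numbers depend on boost clocks and workload):

```python
def fillrates(rops, tmus, clock_mhz):
    """Theoretical fillrates: one pixel per ROP and one texel per TMU per clock."""
    pixel_gps = rops * clock_mhz / 1000  # gigapixels per second
    texel_gts = tmus * clock_mhz / 1000  # gigatexels per second
    return pixel_gps, texel_gts

# GTX Titan at its 836 MHz base clock
pixel, texel = fillrates(48, 224, 836)
print(round(pixel, 1), round(texel, 1))  # 40.1 GP/s, 187.3 GT/s
```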
Note that there are obvious limitations to Kepler which prevent it from completely whooping the life out of Fermi cards: in compute, only GK110 cards can outperform GF110 (GK104 and below are actually weaker), and Fermi has strong ROP throughput compared to GK104 (680/770). So unless you're talking at least a GTX 780, even the GTX 560 Ti 448 will show a certain degree of strength against a GTX 600/700-class card.
Your CPU must be relatively powerful as well: at least a quad-core at 3 GHz.
My google-fu suggests you wouldn't get that card for much less than $300 anyway. Seems to me like they haven't been making a lot of real advancement for a while.
Oh... makes sense, seeing as it's pretty niche. I don't know, here in Brazil things just work differently; both retail and used pricing vary a lot with supply and demand and the embedded tax system. Things usually cost twice or even three times as much as in the US. For example, I paid R$ 3,699 for my Titan (by far the cheapest I found); seeing that 1 USD = roughly 2.30 BRL, that makes the card about $1600 after all the import duty, retailer's profit, national taxes and shipping...
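The arithmetic on that price checks out, using the exchange rate from the post:

```python
# R$ 3,699 converted at ~2.30 BRL per USD (rate as quoted above)
price_brl = 3699
brl_per_usd = 2.30
price_usd = price_brl / brl_per_usd
print(round(price_usd))  # ~1608, i.e. "about $1600"
```

Compare that to the $999 US list price and you can see how much the import chain adds.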
oh well, was worth a try
You called it; on the forums I mostly look for used hardware...
It works like this: CPUs and GPUs are sorted by a binning process. Each one in a product line starts off with the hope of being the best quality possible, but there will always be flaws in the manufacturing process. So let's say a 580 was the goal, but there are a few flaws; they'll just downgrade it to a 570 and load the appropriate programming. A few more flaws on the next one makes a 560, etc.
Sadly, it's a lot less likely you'll end up with a 580-worthy chip than a 560-worthy chip. That's where supply and demand comes in. Intel does the same thing with their processors; it's why the Extreme Edition parts are so damn expensive compared to the rest, regardless of generation. A diamond has cut, color and clarity; a CPU or GPU has temperature, speed and architecture. Both are pretty rare at the top.
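The downgrade chain described above can be sketched as a toy function. The product tiers are real, but the defect thresholds and the per-block yield figure here are made-up numbers, purely to illustrate the idea:

```python
def bin_die(defective_blocks):
    """Map a die's defect count to a product tier (thresholds are hypothetical)."""
    if defective_blocks == 0:
        return "GTX 580"  # fully working chip: the rare one
    if defective_blocks == 1:
        return "GTX 570"  # one block fused off
    return "GTX 560"      # more damage, more disabled units

# Why top bins are rare: with 16 blocks on a die and a (hypothetical) 95%
# chance each block comes out clean, fewer than half of all dies qualify.
p_top_bin = 0.95 ** 16
print(bin_die(0), bin_die(2), round(p_top_bin, 2))  # GTX 580 GTX 560 0.44
```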
The important thing, though, is that anything below an x70 model (e.g. 570, 670, 770) on NVIDIA cards is usually garbage.
What I actually heard some years ago was that NVIDIA deliberately crippled their higher-end cards in order to create the bargain cards, making them all in one process and then enabling or disabling features afterward. That way they can still sell the high end at a ridiculous premium while also making money off the poorer masses.
Not really. I use a 650 Ti; it runs NS2 at 60-80 FPS on medium, and most other games I play I run at the highest settings.
I know, right! I explain this to everyone on YouTube. The GTX 295 is a dual GTX 275 (harvested GTX 285 processors paired with the GTX 260's memory design) crammed into one card; the earlier 295s were even dual-PCB, you see. The GTX 560 is a binned GF114 processor, found in its complete version (with 384 cores) on the GTX 560 Ti; the 560 Ti 448-core is a limited, stock-dump GF110 processor, the same one the GTX 570 uses with a further damaged unit. Same goes for the i7 and i5 CPUs, FX-8000 and FX-6000 chips, etc.
I was suggesting it because it's pretty old and I thought it would be in that price range by now.
Otherwise, don't bother and stick with your old GPU.
Oh, he doesn't mean the processor's power, but the binning, really. The 650 Ti is amazing for such a cheap graphics card; it's the manufacturing process. The GTX 650 Ti uses a cut (binned) GK106 chip. These lower-performing, cheap variants of the same card are actually processors that are either unstable at their stock settings or have actual damage to the silicon. So instead of taking a full loss, what do AMD/NVIDIA/Intel do? They either laser-cut or disable the damaged area by firmware and sell it as a budget chip. Its quality will affect how high the processor can clock, how much power it will consume, and how stable it will be at default settings.
Notorious examples of this are AMD's dual- and triple-core processors from the Phenom II generation, as well as the FX-4000 and FX-6000 processors, which all have instability and cache issues (and many X3s actually have a damaged fourth core); Intel Core i5 processors, which have cache memory stripped out, are unlikely to clock anywhere near their i7 siblings, and have HT disabled due to instability; AMD's HD XX50 series; and NVIDIA's non-Ti processors (with a few exceptions, such as this one and the Ti BOOST).
Interestingly enough, the GeForce GTX 770 is actually a fully enabled, ultra-high-quality GK104 processor, with all 1536 cores, 128 TMUs and the 256-bit memory bus enabled, at a higher clock speed and lower power consumption than its earlier revision at the same clock (it's the same chip used on the GTX 680), while the GTX 780 is actually an extremely poor GK110 processor (2304 cores, 384-bit, 192 TMUs) that has many units deactivated due to poor yield and silicon quality. This says nothing about performance, obviously: the GTX 780 is still a superior graphics card in every conceivable way compared to a GTX 770.
This is done even with some high-end chips, such as the i7-3970X processor (its full design has 8 cores and 20 MB of L3) and the GTX Titan (full design: 2880 cores, up from the 2688 enabled), which are derived from extremely expensive, difficult-to-manufacture processors such as the upper-end Xeon E5 and Quadro K6000, sold to businesses that cannot afford even the slightest margin of hardware failure.
An example of this: the GTX 780, the Titan and this Quadro K6000 are all the same processor, just with varying quality and enabled features. On the low end you have the $649 GTX 780, with its low ASIC quality, "meagre" 3 GB of video memory and fewer enabled blocks. In the midrange you have the $1,000 GTX Titan, with only one block disabled, a plentiful 6 GB of video memory and high ASIC quality, while retaining some of the chip's exclusive high-end features, notably double-precision compute, never before seen on a GeForce card. And finally the Quadro K6000, which comes with professional driver support and a fully enabled chip with a whopping, out-of-this-world 12 GB of video memory, at an entry price of $5,265 for the basic edition, up to $8,599 with the SDI kit. Hope this clears it all up.
I'm gonna go ahead and throw up the BS flag on this post. Unless your "highest settings" are at resolutions below 1920x1080 without AA or MSAA, and "most other games you play" means Minecraft, you're not getting 60-80 FPS even overclocked. Check the link below for why.
http://www.tomshardware.com/reviews/geforce-gtx-650-ti-benchmark-gk106,3318-11.html
40 is too low a budget, dude; you're seriously better off saving up at least for a GTX 650 Ti BOOST or a GTX 760, or an HD 7850/7870.
Definitely. If you set such a low budget you're setting yourself up for disappointment; save what you can, even if it takes months or a year. You could save up all year and get something entry-level from the new generation, or something better from the previous one, and you'll be set for a few years. I always recommend NVIDIA because it does more than just gaming, and it has advanced PhysX in the higher-end cards (GTX x70 or better), which is worth it. If you're on a budget, there are software PhysX options for AMD cards that won't be as good, but workable. But yeah, in a nutshell: save up for something good, not just good enough.