How to enable SLI (driver WHQL 306.97)
Aix
Join Date: 2010-12-02 · Member: 75409
Hello all,
If you're running SLI and find that you're not getting the performance that you should be, it's likely because NS2 does not yet have a proper SLI profile. As you can see below, leaving everything on auto means the driver will only utilize GPU1 to render:
<img src="http://i.imgur.com/C6AJG.jpg" border="0" class="linked-image" />
If you go into your Nvidia Control Panel and under SLI Performance Mode select "Force Alternate Frame Rendering 2", you should see a nice bump in performance:
<img src="http://i.imgur.com/W5crO.jpg" border="0" class="linked-image" />
<img src="http://i.imgur.com/P76Gp.jpg" border="0" class="linked-image" />
Hope it helps.
Comments
How strange - what driver are you using? For me, AFR1 caused both GPUs to run at around 50% and the game was choppy; AFR2 brought them both to max and was much smoother.
Global Setting (Nvidia Recommended): 141 FPS
Single GPU: 140 FPS
AFR1: 43 FPS
AFR2: 145 FPS
AFR2's gains were negligible in my brief and limited test but one thing is for sure: AFR1 was terrible.
Specs:
- Core i7-3770K @ 4.2GHz
- 8GB of RAM
- GeForce GTX 590 w/ Driver 310.33
- OCZ Summit SSD
Graphics Settings:
- 1920x1080
- Fullscreen
- Vsync Disabled
- Medium Textures
- Infestation Minimal
- Anti-Aliasing, Bloom, Atmospherics, Anisotropic Filtering, Ambient Occlusion, Shadows & Texture Streaming: All OFF
- Multicore Rendering On
The newest drivers. It's my understanding that the only difference between the two is which GPU renders the odd frames and which renders the even frames. I am using a pair of GTX 580s, so maybe it's just the cards you're using.
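The odd/even split described above can be sketched as a toy model (this is purely illustrative; it is not driver code, and the GPU indices are assumptions for the example):

```python
# Toy model of Alternate Frame Rendering (AFR): frames alternate
# between two GPUs. AFR1 and AFR2 differ only in which GPU takes
# the even frames, so each card does half the work either way.
def afr_schedule(num_frames, even_gpu=0):
    """Return the GPU index assigned to each frame."""
    odd_gpu = 1 - even_gpu
    return [even_gpu if f % 2 == 0 else odd_gpu for f in range(num_frames)]

# AFR1-style: GPU 0 takes the even frames; the other mode just swaps roles.
print(afr_schedule(6, even_gpu=0))  # [0, 1, 0, 1, 0, 1]
print(afr_schedule(6, even_gpu=1))  # [1, 0, 1, 0, 1, 0]
```

Either way the per-card load is identical; in practice the driver's choice matters only when it interacts badly with how a particular game submits frames.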
You won't notice much of a difference with a single card that's doing "SLI". Realistically you're still only using the one channel from the PCIe slot. If you have two cards it's a much bigger difference due to having two channels.
That is not at all how multi-GPU setups function; saying a dual-GPU card is only as good as a single-GPU card because it only uses 1 PCIe slot is patently incorrect. The bandwidth on PCIe slots is more than enough to handle dual-GPU rendering. Dual-GPU cards often have lower reference clocks to keep power draw (and heat generation) down, which leads to lower scores vs their dual-card counterparts, but dual-GPU cards are certainly capable of SLI/CF...otherwise there would be no point in buying them.
I'll have to try those 310.33 beta drivers and see how they do, although 306.97 were great tonight for me (using 580 3GB SLI). I didn't see any NS2 changes in the documentation though.
That's not what I said - obviously a 590 beats a single 580, but two 580s will beat a 590 hands down. And no, I didn't say 590s don't do SLI; that option is certainly available in the Nvidia settings. However, you won't notice as much of a performance gain because it's still one card on one slot using one set of bandwidth.
I have a single GTX 580, and r_stats shows my "waiting for GPU" at 0 ms (occasionally flashing to 1 ms). This means my system is not waiting for the card to catch up; in my case, my 1090T 3.2 GHz AMD CPU is the bottleneck.
I very much doubt SLI configs will boost the Spark engine, as it seems to be a CPU-intensive game rather than a GPU-demanding one.
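The r_stats reading described above can be turned into a rough rule of thumb. This is a sketch only; the threshold value is an assumption for illustration, not anything the game or driver defines:

```python
def likely_bottleneck(wait_for_gpu_ms, threshold_ms=2.0):
    """Rough heuristic: if the render thread barely waits on the GPU,
    the CPU is pacing the frame; otherwise the GPU is the limiter."""
    return "CPU" if wait_for_gpu_ms < threshold_ms else "GPU"

print(likely_bottleneck(0))    # CPU  (matches the 0 ms reading above)
print(likely_bottleneck(26))   # GPU  (e.g. the 26 ms single-card run below)
```

When the heuristic says "CPU", adding a second GPU cannot raise the framerate, which is exactly the poster's point.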
<b>The best way to get SLI working is to download Nvidia Inspector, select the profile for the game Dishonored, and swap its .exe entry for NS2.exe. I get 99% scaling on my cards that way.</b>
Here's a 690 review showing performance against a 680:
<a href="http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/6" target="_blank">http://www.anandtech.com/show/5805/nvidia-...re-ultra-fast/6</a>
Here's a 690 being tested on PCIE 2.0 and 3.0. No difference.
<a href="http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53901-nvidia-geforce-gtx-690-review-25.html" target="_blank">http://www.hardwarecanucks.com/forum/hardw...-review-25.html</a>
The 690 is slightly slower than the 680 because it does not turbo its frequency as high as the 680s.
Now, if you were going to render at very high resolutions (like 3K or 4K), then you might start seeing a difference. So if we start seeing big "retina" desktop displays, this may change.
First, I'm not comparing a 690 to a 680, even though a pair of 680s still beats a single 690.
Second, those PCIe x16 slots connect to your north bridge, which handles all the signals going to it and coming back. It can't do more than one signal per slot at once. It does do full duplex, so it can do a send and a receive at the same time, but not two of either. This is why two separate cards are faster. It's like the difference between multiple threads and multiple cores.
I've got dual GTX 460 1GB cards and have been getting 25-40 FPS with the global settings. When I changed to AFR1 it went down to 15-20, but with AFR2 it went up to 70-80!
This is a huge win, thank you!
Yes, one PCIe slot has less bandwidth than two PCIe slots, but one PCIe slot still has more than enough bandwidth to support two GPUs. The benchmarks I posted show this: despite PCIe 2.0 having half as much bandwidth as PCIe 3.0, there is almost no difference between them in benchmarks at normal resolutions, even for an Nvidia 690.
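The bandwidth argument can be sanity-checked with rough arithmetic. The per-lane throughput figures below are the commonly published effective rates, and the per-frame traffic is a crude upper-bound estimate, not measured data:

```python
# Rough check of PCIe headroom for a dual-GPU card on one slot.
PCIE2_X16_GBS = 16 * 0.5      # ~8 GB/s   (PCIe 2.0: ~500 MB/s per lane)
PCIE3_X16_GBS = 16 * 0.985    # ~15.8 GB/s (PCIe 3.0: ~985 MB/s per lane)

# A finished 2560x1440 frame at 4 bytes per pixel:
frame_mb = 2560 * 1440 * 4 / 1e6              # ~14.7 MB
traffic_at_144fps_gbs = frame_mb * 144 / 1e3  # ~2.1 GB/s

print(round(frame_mb, 1), round(traffic_at_144fps_gbs, 2))  # 14.7 2.12
assert traffic_at_144fps_gbs < PCIE2_X16_GBS  # even PCIe 2.0 has headroom
```

Even if you generously assume every frame crosses the bus fully formed, the traffic is a fraction of what one x16 slot provides, which is consistent with the PCIe 2.0 vs 3.0 benchmarks showing no difference.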
SLI AFR1 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-OFF, All details-High | <b>15.5 FPS, 38ms waiting for GPU</b>
<a href="http://imgur.com/Yl0L6" target="_blank"><img src="http://i.imgur.com/Yl0L6l.jpg" border="0" class="linked-image" /></a>
SLI AFR1 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-MAX, All details-High | <b>12.6 FPS, 58ms waiting for GPU</b>
<a href="http://imgur.com/Tka4L" target="_blank"><img src="http://i.imgur.com/Tka4Ll.jpg" border="0" class="linked-image" /></a>
SLI AFR2 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-OFF, All details-High | <b>63.6 FPS, 2ms waiting for GPU</b>
<a href="http://imgur.com/HAmzA" target="_blank"><img src="http://i.imgur.com/HAmzAl.jpg" border="0" class="linked-image" /></a>
SLI AFR2 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-MAX, All details-High | <b>45.5 FPS, 7ms waiting for GPU</b>
<a href="http://imgur.com/ha9s4" target="_blank"><img src="http://i.imgur.com/ha9s4l.jpg" border="0" class="linked-image" /></a>
Single GPU (GTX 580 3GB): 2560x1440, AA-ON, AF-ON, AO-OFF, All details-High | <b>50.2 FPS, 7ms waiting for GPU</b>
<a href="http://imgur.com/CRUKy" target="_blank"><img src="http://i.imgur.com/CRUKyl.jpg" border="0" class="linked-image" /></a>
Single GPU (GTX 580 3GB): 2560x1440, AA-ON, AF-ON, AO-MAX, All details-High | <b>37.3 FPS, 26ms waiting for GPU</b>
<a href="http://imgur.com/3dJFS" target="_blank"><img src="http://i.imgur.com/3dJFSl.jpg" border="0" class="linked-image" /></a>
It still dips to 38-45 FPS in certain areas of the map, though others can run at 70+.
So disappointed... wasted my $25, until they can implement SLI for this game!
Specs:
GTX 570 x2 SLI w/ Driver 310.33
8 Gigs RAM
2500K @ 4 GHz
Game on Crucial M4 SSD
Did you try it with 306.97? I haven't tried this with that beta driver.
Careful... SLI already has 1 frame of lag, AFR2 adds 3 more to it, and when FPS drops it gets worse.
That's the tradeoff of AFR and pre-rendered frames - input delay.
And if you're like me, having raw mouse input and snappy mouse control without mouse accel or delay is way more important than a few more frames per second.
Fun example of AFR:
<a href="http://www.pcgameshardware.com/aid,675353/GPU-benchmark-review-CrossfireX-versus-Quad-SLI-and-3-Way-SLI/Reviews/?page=2" target="_blank">http://www.pcgameshardware.com/aid,675353/...Reviews/?page=2</a>
Some of you have a beast of a rig, so it's really odd seeing 12 FPS. ^.^
Maybe run a single GPU until Nvidia releases a profile/beta drivers? I really think you can feel the input-delay difference between default and AFR2... I can.
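The delay being argued about scales with both the number of buffered frames and the frame time, which is why it feels worse at low FPS. A back-of-envelope model (the queue depths here are illustrative assumptions, not measurements from any driver):

```python
def queued_input_delay_ms(fps, queued_frames):
    """Extra input latency from frames buffered ahead of display:
    each queued frame costs roughly one frame time (1000/fps ms)."""
    return queued_frames * 1000.0 / fps

# At 60 fps, one extra buffered frame is ~16.7 ms; at 30 fps with a
# deeper queue the penalty balloons, matching the "worse when FPS
# drops" complaint above.
print(round(queued_input_delay_ms(60, 1), 1))   # 16.7
print(round(queued_input_delay_ms(30, 3), 1))   # 100.0
```

This also explains why some posters feel the lag and others don't: at high framerates the added delay shrinks below what most people can perceive.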
I've been using afr2 since yesterday morning and I haven't noticed any input lag at all.
AFR2 helped a bunch with FPS since my second card wasn't at 2% anymore, but my FPS still wasn't completely smooth, though my "waiting for GPU" was now always 0. So I overclocked my CPU from 3.3 to 4.4 GHz, and now it's 100% playable and smooth during every type of scene (24-man game): minimum 70 FPS, average 80, maximum 110. Medium textures because of the 1GB cards, medium AO because the difference isn't that big, and Nvidia FXAA instead of in-game AA.
Originally I was dropping to 35-40 FPS during big scenes on the lowest settings possible. That hurts your aim and your eyes, so I'm pretty happy now.
2600k 3.4 @ 4.4
560ti sli
p8p67 deluxe
8gb 1600 ddr3
win7
Check out my post. It helps with SLI quite a bit
I have already posted how to achieve awesome SLI scaling - you just need the proper profile.
There's a lot more to SLI rendering than just AFR1 and AFR2. There are other settings which are invisible to the user and can only be unlocked by editing the profiles directly in Nvidia Inspector. The Dishonored profile offers the best performance by far, with no input lag.
If anyone with SLI wants a tutorial, just ask me.
That is a quad-SLI review from 2009 of cards that are four generations old; that's not really applicable here unless we have a bunch of people rocking 4x GTX 285s. Drivers are different, technology is different, games are different, and quad-SLI was always a mess compared to even tri-SLI, much less regular SLI.
As for input lag, I had no trouble hitting skulks last night, and considering the interp on some of the weapons (the grenade launcher fires like the default TF2 launcher: 100ms delay), I'm not sure a "1 frame delay" is going to make or break anything if there is one - especially not when the alternative is a garbage framerate.
I downloaded Nvidia Inspector last night but didn't see anything about profiles...where do I find that? It basically looks like GPU-Z combined with an overclocking tool. Or did you mean update to the latest Nvidia beta driver and then use the Dishonored SLI profile in the Nvidia CP?
Send me a PM i will walk you through it
I personally ran into this issue recently with a 570 SLI setup in NS2 and BF3.
That website was provided to show an exaggeration of how the setting works - no, you may not be using four cards, but it highlights the core issues with AFR, which are present even with two cards.
The way the method works, and the reason Nvidia defaults to that sort of pre-rendered-frames profile, as some have accused, is simply to get higher FPS in benchmarks.
But you are delaying the frames, make no mistake about it.
If it does not affect you, then lucky you - stick with it. :)
But for people like me who detest any form of delay, it's no vsync, raw input, and as few pre-rendered frames as I can get away with - and after dealing with SLI microstutter and input delays for years, I am now happily a single-GPU user.
I just wanted to warn others out there about this - it's worth a mention and testing for yourself. I really wish the OP would at least warn of it in his first post... :-/
<b>edit:</b>
<b>Source = www.nhancer.com </b>
<i>"Alternate Frame Rendering (AFR)
When using the Alternate Frame Rendering, each other frame is rendered by one of the two cards alternatively. So one card renders a complete image, while the other card is already rendering the following image.
This mode is very CPU effective. In theory, <u>it can introduce a slight lag (i.e. mouse movements are not as immediate as normal), </u>but this effect is so small, that practically nobody ever notices it."</i>
<b>Source = SLI wiki</b>
<i>While AFR may produce higher overall framerates than SFR,<u> it may result in increased input latency due to the next frame starting rendering in advance of the frame before it.</u> This is identical to the issue that was first discovered in the ATI Rage Fury MAXX board in 1999.[1] This makes SFR the preferred SLI method for fast paced action games.</i>
I recommend the lower-FPS but smoother Split Frame Rendering (SFR).
<a href="http://isiforums.net/f/attachment.php?attachmentid=2081&d=1335031950" target="_blank">http://isiforums.net/f/attachment.php?atta...mp;d=1335031950</a>
^Only pay attention to the SLI settings; the rest are for a specific driver version, so I wouldn't suggest using them. And obviously set GPU_COUNT to however many cards you have.
edit: f u iron horse you beat me to it :P
<a href="http://www.nvidia.com/object/win8-win7-winvista-64bit-310.54-beta-driver.html" target="_blank">http://www.nvidia.com/object/win8-win7-win...eta-driver.html</a>
(<a href="http://www.reddit.com/r/ns2/comments/1369ju/nvidia_sli_driver_update_for_ns2/" target="_blank">Source</a> from Reddit)
You need to be in the 500 series of cards to get positive effects above medium graphics settings.
<a href="http://www.nvidia.com/object/win8-win7-winvista-64bit-310.54-beta-driver.html" target="_blank">http://www.nvidia.com/object/win8-win7-win...eta-driver.html</a>
(<a href="http://www.reddit.com/r/ns2/comments/1369ju/nvidia_sli_driver_update_for_ns2/" target="_blank">Source</a> from Reddit)<!--QuoteEnd--></div><!--QuoteEEnd-->
How do I use this profile?