Subnautica doesn't switch to Nvidia GPU
Sounlligen
Join Date: 2016-03-19 · Member: 214524 · Members
Hello everyone.
I know it's been reported a few times (since May 2015) but the problem persists. I have an integrated Intel HD Graphics 4000 and an NVIDIA GeForce GT 740M. The game uses only the integrated GPU and I can't force it to use the latter.
I know it's a problem with Unity, and I would like to know if there's been any progress or news.
Thanks a lot for your answer.
Comments
Right-click on your desktop and select NVIDIA Control Panel.
In the Control Panel, select Manage 3D settings, then click on the Program Settings tab.
Next to the "Select a program to customize" option, click the Add button.
From the pop-up window, navigate to the folder where Subnautica is installed. This will most likely be C:\Program Files (x86)\Steam\steamapps\common\Subnautica\
Select the executable file for the game (Subnautica.exe).
In the "Select the preferred graphics processor for this program" option, open the drop-down menu and select High-performance NVIDIA processor and confirm.
To improve performance, go to the "Specify the settings for this program:" section, click Power management mode, select Prefer maximum performance, and confirm.
To further improve performance, you can go to the Manage 3D Settings section, click on the Global Settings tab, and change the settings below:
Vertical sync to off
Threaded optimization to off
Triple buffering to off
Note: The above instructions may vary depending on your driver version and graphics card. If you require further assistance, you should contact NVIDIA Support here: http://www.nvidia.com/page/support.html.
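If you want to verify that Windows actually exposes both GPUs to 3D applications before digging further into driver settings, a few lines of DXGI code will list them. This is only a minimal sketch (plain C++ against the Windows SDK, linked with dxgi.lib), not anything Subnautica-specific:

// Lists every display adapter Windows exposes to Direct3D applications.
// Build as a normal Win32 console program and link with dxgi.lib.
#include <dxgi.h>
#include <cstdio>

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }

    factory->Release();
    return 0;
}

On a laptop like the one described above, both the Intel HD Graphics 4000 and the GeForce GT 740M should show up; if the NVIDIA card is missing entirely, the problem is with the driver installation rather than with the game.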
This would not be advisable. On Optimus systems, the Intel GPU is used as a display frame buffer directly connected to the screen. If this is disabled, the Nvidia GPU cannot output anything to the screen.
I have the very same problem; no matter what I do (I've tried Cynical_Scrub's advice), the game still runs on the Intel HD Graphics GPU. Fortunately, it runs smoothly enough and I only get occasional FPS drops.
Still, I came to the forums to see if there was any way to get around it, and I'm a bit sad there isn't!
I just hope the game's requirements don't change too much while it's being developed; I'm not sure the Intel GPU can withstand much more.
Also this one's worth a try to verify GPU state: http://forums.laptopvideo2go.com/topic/26992-optimus-test-tools-finally-in-users-hands/
And there's nothing wrong with my Nvidia GPU, it's working as intended for all the other games. Needless to say I always keep the drivers up-to-date as well.
Anyway, if Obraxis said the issue comes from a problem between Unity and Nvidia, I think there's not much we can do about it.
What I mean is, the tools show what programs are running on the GPU. Basically the same as the tray-thingy Nvidia provides, without being a tray-thingy.
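For anyone who would rather not install extra tools, the same check can be done by hand: create a default Direct3D 11 device and print which adapter it ended up on, which is the GPU an ordinary 3D program gets on an Optimus system. A rough sketch (link with d3d11.lib and dxgi.lib):

// Creates a default D3D11 device and reports which adapter it landed on.
#include <d3d11.h>
#include <dxgi.h>
#include <cstdio>

int main()
{
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    IDXGIDevice* dxgiDevice = nullptr;
    device->QueryInterface(__uuidof(IDXGIDevice), (void**)&dxgiDevice);

    IDXGIAdapter* adapter = nullptr;
    dxgiDevice->GetAdapter(&adapter);

    DXGI_ADAPTER_DESC desc;
    adapter->GetDesc(&desc);
    wprintf(L"Default D3D11 device is on: %s\n", desc.Description);

    adapter->Release();
    dxgiDevice->Release();
    device->Release();
    return 0;
}

If this prints the Intel GPU even after you add a Program Settings entry for the test executable, the driver-side override isn't taking effect at all, which points at the same Optimus detection issue the game is hitting.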
There are a lot of developers of Unity-based games who have solved this problem, so maybe you (UW) should talk to their devs and see how they solved it. Savage Lands and Empyrion are two examples I know of specifically.
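For reference, the usual developer-side fix on Optimus systems is the one NVIDIA documents: the game's .exe exports an NvOptimusEnablement symbol, and the driver then routes rendering to the discrete GPU (AMD switchable graphics have an equivalent AmdPowerXpressRequestHighPerformance export). Players can't apply this themselves, but it's presumably the kind of change those other Unity titles shipped:

// Exported from the application's .exe (not a DLL); the Optimus driver
// reads these symbols at launch and prefers the discrete GPU.
extern "C"
{
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;  // NVIDIA Optimus
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;    // AMD switchable graphics
}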