
SLI - Does it require specific coding?

Scraggle
Moderator
Joined: 10th Jul 2003
Location: Yorkshire
Posted: 9th Dec 2007 19:34 Edited at: 9th Dec 2007 19:34
I just bought a new PC with twin GeForce 8800 GTX 768MB graphics cards in it and was eager to see how much faster my projects would run.
So, I loaded up the bitmap font tutorial from this month's newsletter and tested it.

Old PC:
P4 3.2GHz
GeForce 6600GT
4GB DDR2

New PC:
Quad Core 2.6GHz
2 x GeForce 8800 GTX 768MB (SLI)
4GB DDR2

Result of running the Bitmap Font Tutorial:
Old PC: 900fps
New PC with SLI disabled: 1650fps
New PC with SLI enabled: 1650fps

So, I'm obviously happy that my new PC has almost doubled my framerate, but there is no difference at all whether I use one graphics card or both. Hence the title of the post: does using SLI require specific coding, or is there some other reason there's no difference?
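
For anyone who wants to repeat the comparison, a bare-bones loop like this is enough to read off the framerate (just a sketch, not the actual tutorial code):

rem Minimal DBP framerate readout - a sketch, not the newsletter tutorial itself
sync on : sync rate 0          ` uncap the framerate so the card runs flat out
make object cube 1, 50         ` something trivial to draw

do
    yrotate object 1, object angle y(1) + 1
    text 10, 10, "FPS: " + str$(screen fps())
    sync
loop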

Thanks



El Goorf
Joined: 17th Sep 2006
Location: Uni: Manchester, Home: Dunstable
Posted: 9th Dec 2007 20:05
I asked Lee about this at the convention. He said it's all handled by the drivers. He could release something in a patch to allow us to play with SLI settings, but it'd be pointless, since the drivers will always use the optimal settings for the game. Perhaps now you could prove him wrong.

http://notmybase.com
All my base are not belong to anyone.
jasonhtml
Joined: 20th Mar 2004
Location: OC, California, USA
Posted: 9th Dec 2007 20:19
I don't think you should need any commands, but SLI only seems to make a big difference in games with certain settings on... (like really high resolution, anti-aliasing, etc.)
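
For example, forcing a high resolution before the test gives the cards far more pixels to fill (a sketch; pick a mode and depth your monitor actually supports):

rem Stress fill rate with a high resolution - sketch, adjust to your monitor
set display mode 1920, 1200, 32
sync on : sync rate 0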

Silvester
Joined: 7th Dec 2005
Location: Netherlands
Posted: 9th Dec 2007 20:30
I think SLI will only kick in when your graphics card needs to process a lot of geometry and effects at once, not just a simple DBP program.

dark coder
Joined: 6th Oct 2002
Location: Japan
Posted: 9th Dec 2007 21:02 Edited at: 9th Dec 2007 21:02
This is an interesting read: http://developer.nvidia.com/object/gpu_programming_guide.html

It's possible DBP defaults to what the guide calls compatibility mode: "Compatibility mode simply uses only a single GPU to render everything (that is, the second GPU is idle at all times). This mode cannot show any performance enhancement, but also ensures an application is compatible with SLI." I too haven't noticed any speed boost with SLI (though I'm now on a single-GPU system).

Also, for a comparison like this it's probably best to use a GPU-intensive test (so you know the CPU isn't the bottleneck), i.e. a very high-poly mesh with vertex/pixel shaders, large textures, etc. Something along these lines would do, as sketched below.
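
A sketch of that kind of test (object counts, poly density and positions are arbitrary, and you could add LOAD EFFECT shaders and big textures on top):

rem GPU-heavy SLI test sketch: many high-poly spheres, framerate uncapped
sync on : sync rate 0
randomize timer()

rem 50 spheres at 100x100 segments each gives the GPU plenty of vertices
for i = 1 to 50
    make object sphere i, 20, 100, 100
    position object i, rnd(400) - 200, rnd(400) - 200, rnd(400) - 200
next i

position camera 0, 0, -400

do
    for i = 1 to 50
        yrotate object i, object angle y(i) + 1
    next i
    text 10, 10, "FPS: " + str$(screen fps())
    sync
loop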

GatorHex
Joined: 5th Apr 2005
Location: Gunchester, UK
Posted: 9th Dec 2007 23:06 Edited at: 9th Dec 2007 23:08
Also, only one core of your quad is being used.

Imagine how fast it could be if it used them all!

Is it a 2.4 overcocked? The 2.66 is a big price jump.

DinoHunter (still no nVidia compo voucher!), CPU/GPU Benchmark, DarkFish Encryption DLL, War MMOG (WIP), 3D Model Viewer
NeX the Fairly Fast Ferret
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 10th Dec 2007 17:28
Would the bitmap font demo be rendered by the GPU?


Since the other one was scaring you guys so much...
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Posted: 10th Dec 2007 18:14
Quote: "Is it a 2.4 overcocked?"


My PC was well and truly "overcocked" a long time ago.
