
Geek Culture / Regarding DirectX in general

MrValentine
Posted: 24th Aug 2011 05:55 Edited at: 24th Aug 2011 09:10
I always thought anything running in DX was usually moved onto the GPU, like a 3D object... I know not everything DX does goes to the GPU, as pretty much... everything in Windows is driven by DX... as far as my OS know-how goes... anyway, without sounding dim or dumb, I just wanted to know is there a possible method to force all our programs to jump off the GPU? I know Win7 has DirectCompute... can we integrate this into our DBPro productions, or... do we need to implement some C++ coding via DGDK or something?

More insight into this may benefit us all here I believe...

I am keeping this short as I am still playing Xenogears at the moment and it is 03:57 am here, lol. I have another post to write up before I forget it.

Hope this was enough to go on for now.

EDIT

I realise I should have added DirectCompute to the title... hope a MOD doesn't mind adding it in { } after 'general', thanks.

Dark Java Dude 64
Posted: 24th Aug 2011 06:25
Hmmm, well I'm sure you could use assembly code, or maybe even just a typical high-level language like C++ that doesn't use DX unless you tell it to, then do all graphics calculations yourself and only use the GPU to draw the individually calculated pixels to the screen. I hope that's what you meant!!

By the way, now that your website is finished, it looks really awesome!

Irrigation of the land with seawater desalinated by fusion power is ancient. It's called rain.
MrValentine
Posted: 24th Aug 2011 06:34
Quote: "use the GPU to draw the individually calculated pixels to the screen. "

This is exactly what I am looking to figure out...


Quote: "By the way, now that your website is finished, it looks really awesome!"

Thanks buddy, I have yet to add a few more pages to it, namely Video and a proper contact page... as well as FB/Twitter stuff...

Forgot what I was going to write next... darn hunger

If it ever comes back to me I will let you know... {oh, my shop is coming together slowly...} that was it, I guess...

Aside from that, erm, I need to sort out the Communities site, I might as well hint at it... it includes free WordPress hosting ^^ as well as Joomla and Drupal for those who know how to use them... however, I have had issues with Drupal being in a sub-sub folder...

Back on topic... I wonder if there is a simple plugin or line of code already in DBP that forces the GPU to move its cores and process some data visually... it would make a huge difference to frame rate...

ionstream
Posted: 24th Aug 2011 06:43
No, there is no way to make programs run from the GPU, nor is there any way to write a program that runs solely on the GPU. You can, however, run algorithms on the GPU using CUDA/OpenCL/DirectCompute as part of a larger CPU-driven program. If you are just asking how to get graphics to process on the GPU, then using DirectX 11/OpenGL 4.0(ish) will force you to do this via shaders and vertex buffers.
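
To make that concrete, here is a minimal C++ sketch of that CUDA/OpenCL/DirectCompute pattern, written against the OpenCL C API (not something DBPro exposes; error checking omitted and the kernel is a toy). The program stays CPU-driven; only the small data-parallel kernel runs on the GPU:

// Minimal OpenCL sketch: a CPU-driven program that hands one
// data-parallel kernel to the GPU. Error checking omitted for brevity.
#include <CL/cl.h>
#include <cstdio>

// The "algorithm" that runs on the GPU, as OpenCL C source:
static const char* kSrc =
    "__kernel void add(__global const float* a,\n"
    "                  __global const float* b,\n"
    "                  __global float* c) {\n"
    "    int i = get_global_id(0);  /* one work-item per element */\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main() {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    // Boilerplate: find a GPU, make a context and a command queue.
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    // Build the kernel from source at runtime.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add", NULL);

    // Copy inputs into GPU memory, run N work-items, read the result back.
    cl_mem ba = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem bb = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem bc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
    clSetKernelArg(k, 0, sizeof ba, &ba);
    clSetKernelArg(k, 1, sizeof bb, &bb);
    clSetKernelArg(k, 2, sizeof bc, &bc);
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, bc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %f\n", c[10]);  // 30.0, computed on the GPU
    return 0;
}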

Also you are using ellipses when commas or periods are more appropriate.

MrValentine
Posted: 24th Aug 2011 06:47
Quote: "Also you are using ellipses when commas or periods are more appropriate."


huh?

Thanks for answering the question ionstream...

Come to think of it... Vertex buffers? I know using too many shaders would be a bad idea... how can I utilise Vertexes to get the GPU to wake up?

Also umm that means there is nothing I can do with DBPro to invoke the GPU to say hello?

Neuro Fuzzy
Posted: 24th Aug 2011 08:03 Edited at: 24th Aug 2011 08:04
What are you talking about? I don't really see a clear question here at all. Forcing programs *off* the GPU?

OpenCL and CUDA are two libraries used to write code for parallel computing. As you're probably aware, 3D transformations use matrix multiplication. Multiplying these 4x4 matrices with a 3D point is computationally heavy when you have hundreds of thousands of points, so the GPU uses streaming cores to run tons and tons of floating-point calculations all at the same time.
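
To put a number on that, here is the per-point work written out in plain C++ (a toy sketch; the point is treated as (x, y, z, 1) so the same multiply can translate as well as rotate):

#include <cstdio>

// One 3D point transformed by one 4x4 matrix (column-vector convention).
// The point is carried as (x, y, z, 1) so the matrix can also translate it.
struct Vec4 { float x, y, z, w; };

Vec4 transform(const float m[4][4], Vec4 v) {
    return {
        m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w,
        m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w,
        m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w,
        m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w,
    };
}

int main() {
    // A translation by (5, 0, 0): 16 multiplies + 12 adds for one point.
    float m[4][4] = {{1,0,0,5},{0,1,0,0},{0,0,1,0},{0,0,0,1}};
    Vec4 p = transform(m, {1, 2, 3, 1});
    printf("%f %f %f\n", p.x, p.y, p.z);  // 6 2 3
    // Now imagine this loop over 500,000 vertices, every frame...
    return 0;
}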

So, a GPU is not like your normal processor. That's why you can't just get code to run on a GPU.

I've looked into it a little, and I believe you need to use their compiler, which probably means there are separate keywords and syntax quirks for CUDA.

No idea what you're talking about that "almost everything in windows runs off the GPU". As far as I'm aware, the only thing that runs on the GPU is the Windows 7 Aero GUI stuff (with see-through panels and fancy-schmancy stuff like that). Everything else is on the CPU.

[edit]
And by the looks of it, the CUDA compiler only supports C and C++. That sounds relevant to your problem xD


Why does blue text appear every time you are near?
MrValentine
Posted: 24th Aug 2011 08:13 Edited at: 24th Aug 2011 08:17
CUDA is Cg, actually... I am aware of this... your typos are confusing me too...

Perhaps it's the UK English that you didn't get...

Anyway, the point here is... when you play a game, for example... CS: Source, it is a highly CPU-intensive game... whereas other games rely on your GPU more than your CPU... I just wanted to ensure my applications were running off {ON for Neuro} the GPU.

Actually this is a question that has puzzled me for some time now... anyone know if PerfHUD can answer the following for me?

How to measure how much of your GPU is being utilised while playing a game... and especially in OUR case, games which we created.


Quote: "I just wanted to know is there a possible method to force all our programs to jump off the GPU"
pretty clear to me... try imagining this... you were processed by a GPU... there are two platforms on two sides of a swimming pool... one named CPU, the other GPU... now if I said JUMP OFF THE GPU, which platform are you going to jump from? Also, as you were processed by the GPU, you would already be where exactly? Oh, the GPU platform... hope this helped you get past my UK English talk... either that or it's a northerner thing we say haha.

EDIT

Adding to this, I do not just mean new games... some older games process by default on the GPU; I figured it was something DX did as the applications accessed GPU-specific commands/functions. Obviously I do not mean the whole application, just the laborious functions like a visual effect or images being pasted onto objects etc., or for more info... a character model's textures.

ionstream
Posted: 24th Aug 2011 08:50
CUDA and Cg are completely different, and CS: Source is not a particularly CPU-intensive game. Running a program on the GPU is only suitable for highly parallel tasks, like graphics processing; moving a piece of code from the CPU to the GPU will not automatically make it faster.

Quote: "Come to think of it... Vertex buffers? I know using too many shaders would be a bad idea... how can I utilise Vertexes to get the GPU to wake up?"


This idea does not make sense.

MrValentine
Posted: 24th Aug 2011 09:10 Edited at: 24th Aug 2011 09:24
Quote: "If you are just asking how to get graphics to process on the GPU, then using DirectX 11/OpenGL 4.0(ish) will force you to do this via shaders and vertex buffers."


I was referring to what you said...


But anyway, I figured out it must be the reason why the shader version on games is so important, as shaders (vertex shaders included) are the main thing thrown at the GPU to handle, so I get your point now... I'm starting to wake up... it's 07:07 am here now...


Quote: "No idea what you're talking about that "almost everything in windows runs off the GPU". "


I never said this...

Anyway, I did say my initial brief was not detailed enough; this is starting to get messy... hope it's all making sense now...

EDIT

I saw this linked to CUDA at one point, but here are the highlights of Cg, hope some find it helpful:

http://developer.nvidia.com/cg-toolkit

EDIT

Here is PerfHUD... it might help me answer how much of my application is loaded onto the CPU vs the GPU.

http://developer.nvidia.com/nvidia-perfhud

EDIT

Wanted to add... if something I say is unclear... please just ask me to elaborate on it... instead of stabbing me, I will appreciate this approach much more than the 'YOU GOT IT ALL WRONG' or 'WHAT YOU TALKING ABOUT' approach. Ahh, what a wonderful sky outside.

TheComet
Posted: 24th Aug 2011 09:24 Edited at: 24th Aug 2011 13:40
Quote: "Also umm that means there is nothing I can do with DBPro to invoke the GPU to say hello?"




I also found this to work really well:



TheComet

MrValentine
Posted: 24th Aug 2011 09:25 Edited at: 24th Aug 2011 09:43
@TheComet

Always loved you >.< will try that first code shortly... not so certain about the second one... it might harm my precious computer

EDIT

Found a few errors, just a few bits... fixed them, here's the fixed code:



Thanks for this, it's an interesting concept, I might try PerfHUD on it and see whether the GPU picked up the bill haha... ahh, I can always rely on TheComet to brighten up my mornings {or is it my nights? - comets in sky... = light?}

EDIT

For the realm of not making clear sense... here's a quote from the PerfHUD download page...

Quote: "Download PerfKit 6.70 for Win7 and Vista


• Windows 7 and Vista (32-bit applications)
• Windows 7 and Vista (64-bit applications)
•Older versions of PerfHUD are available here."


I am confused... 64-bit Windows or 64-bit applications? I do not know many games written in 64-bit... so... huh? Would I be safe to just install the 64-bit edition? I never had this dilemma last time I installed it... god, I sound like a NEWBIE tonight... just to make sense, I am only asking possibly age-old questions which I never got clear answers to before. Argh, I need to have some coffee... more so the caffeine, but hey, we all know what I mean by coffee... right? Anyhoo, lol, I have been playing Xenogears for 24+ hours now... cannot believe I am still just under 1/3 of the way through... and I am going through the story faster than you regularly should... using a tut from GameFAQs to aid me, haven't been playing games for some time... and wanted to make sure I didn't miss anything out, as I played Star Ocean 2 a few weeks back and, well... killed off half of the characters you can only get on disk one... after that, on disk two {without giving a massive spoiler out}, you no longer have the chance to employ them into your team.

Fallout
Posted: 24th Aug 2011 12:12
lol - I couldn't make head nor tail of this thread either. 'Jump off the GPU' to me meant 'stop doing any work on the GPU', but then it sounds like you meant the opposite!

My understanding is based on my Java+OpenGL experience, but I'm sure the concept is the same for C++ and DirectX. The final rendering of your vertex data takes place on the GPU, as instructed by DirectX, unless it is running in some form of software mode, or the commands you're issuing are not supported by the GPU.

To make everything run in software and just have the pixel colours sent off to the hardware, you just do all your complex 3D maths in your program, arrive at an array of dots, and use a simple draw command.
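
A tiny C++ sketch of that 'array of dots' idea (PresentPixels is a made-up stand-in for whatever single blit/draw call your framework offers):

#include <cstdint>
#include <vector>

// Software rendering in a nutshell: the CPU fills a width*height array of
// ARGB colours, and the hardware's only job is one blit of the result.
int main() {
    const int w = 640, h = 480;
    std::vector<uint32_t> pixels(w * h);

    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            pixels[y * w + x] = 0xFF000000              // opaque
                              | ((x * 255 / w) << 16)   // red ramps left to right
                              | ((y * 255 / h) << 8);   // green ramps top to bottom

    // PresentPixels(pixels.data(), w, h);  // hypothetical one-call "simple draw command"
}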

To make as much as possible run on the GPU, you do as little as possible in your program, plug as much of the data into DirectX command sets as possible, and then let the renderer do it all for you.

In OpenGL 2 for example, I can write shader programs which do a lot of the work. When I send my vertex data off to be rendered, I know the shader program will do as much on the GPU as possible. If I keep that shader program simple, I'll have to do all the work in my program to set up the vertex buffers before I send them off to OpenGL to be rendered. The process for DirectX must be much the same.
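
For illustration, an OpenGL 2 era vertex shader of the simple kind described here, kept as a C++ string (a sketch, assuming a GL 2.0 context with the shader entry points already loaded; the legacy gl_* built-ins supply the fixed-function matrices):

#include <GL/gl.h>  // plus your extension loader for the GL 2.0 entry points

// GPU side: a tiny program run once per vertex. The matrix multiply that
// positions every vertex happens here, on the GPU, not in your C++ code.
static const char* kVert =
    "#version 110\n"
    "void main() {\n"
    "    gl_Position    = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "    gl_TexCoord[0] = gl_MultiTexCoord0;\n"
    "}\n";

GLuint buildVertexShader() {
    GLuint s = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(s, 1, &kVert, NULL);
    glCompileShader(s);  // the driver compiles this for the GPU
    return s;            // attach to a program object and glLinkProgram
}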

Whether you can do that in GDK, I doubt. I think the point of GDK is to stop you having to worry about how anything gets rendered and just be content that it gets done. If you really want to have control over how things are rendered to maximize efficiency of GPU utilization, or to keep everything in software and minimize GPU utilization, you'll need to use DirectX directly.

I have no idea if that helped at all.

MrValentine
Posted: 24th Aug 2011 12:30
<3 fallout

yeah it helped quite a lot... especially when you mentioned the array of dots part, got my mind tingling...

I came to this question for several reasons... 1) I am making a desktop application, and 2) I want to recreate some nostalgic moments such as a PS1 game scene... just for fun, or possibly adapt the same visual style. I also came to the idea of using the GPU for effects etc., as I know I will encounter delays if I am using a lot of effects, but anyhow... I just wanted to find out more about utilising the GPU to process data/graphics...

For example going back to you 'Fallout, the array of dots how would I use this for say a sphere which has a texture mapped to it?

Fallout
Posted: 24th Aug 2011 13:00
Quote: "For example going back to you 'Fallout, the array of dots how would I use this for say a sphere which has a texture mapped to it?"


With great difficulty! When you render a sphere, you have a buffer of vertices which make up that sphere. Each vertex in that buffer will have position, texture (UV) and normal data (and perhaps much more). Generally you send that buffer off to DirectX and it then renders it for you, and then you have your lovely image on screen.

It uses a matrix calculated from the camera projection (i.e. field of view), the camera's 'look at' (i.e. where it is and where it's pointing), and the sphere's translation/rotation. It then positions all those vertices that make up your sphere into a 2D image (i.e. your screen), using the matrix it's just calculated. Then it links those vertices together in 3s using triangles. Then for each pixel a triangle occupies, it calculates which texel (pixel in the texture) it needs, and colours that pixel accordingly. And there is your sphere! DirectX/GPU does all that for you.

What you appear to be talking about is doing all that yourself, so you can find out which pixel is which colour. In order for you to do that, you'll need to understand how to:
(a) Break your 3D model down into vertex data and create vertex buffers and
(b) Translate that vertex data using matrices and other 3D maths into 2D screen coordinates and colours.

^A lot to learn, it ain't easy, and I wouldn't like to try and figure it out.
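
For the curious, a rough C++ sketch of step (b) for a single vertex, leaning on the D3DX9 maths helpers (the numbers are made up; it is the same maths DirectX would otherwise run for you on the GPU):

#include <cstdio>
#include <d3dx9math.h>  // D3DX9 maths helpers; link against d3dx9.lib

// Step (b) by hand: take one model-space vertex through world -> view ->
// projection, then map the result to pixel coordinates.
int main() {
    const float screenW = 640.0f, screenH = 480.0f;

    D3DXMATRIX world, view, proj;
    D3DXMatrixTranslation(&world, 0.0f, 0.0f, 10.0f);  // push the model 10 units away
    D3DXVECTOR3 eye(0, 0, 0), at(0, 0, 1), up(0, 1, 0);
    D3DXMatrixLookAtLH(&view, &eye, &at, &up);
    D3DXMatrixPerspectiveFovLH(&proj, D3DX_PI / 4, screenW / screenH, 1.0f, 100.0f);
    D3DXMATRIX wvp = world * view * proj;               // D3DX uses row vectors

    D3DXVECTOR3 v(1.0f, 1.0f, 0.0f), out;
    D3DXVec3TransformCoord(&out, &v, &wvp);             // does the divide-by-w too

    // Clip space runs -1..1; convert to pixels (screen y grows downwards).
    float px = (out.x * 0.5f + 0.5f) * screenW;
    float py = (1.0f - (out.y * 0.5f + 0.5f)) * screenH;
    printf("vertex lands at pixel (%.0f, %.0f)\n", px, py);
    return 0;
}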

In most applications, to arrive at your array of dots which represents the screen, you'd send it off to DirectX to be rendered in the normal way, and hey presto, the screen it renders is your array of dots. You could then redraw these, if you so chose. But I get the feeling you might not be going about this the right way!

MrValentine
Posted: 24th Aug 2011 13:49 Edited at: 24th Aug 2011 13:50
Nonono, not at all, I am just trying to figure out the best method to speed up display output... FPS, in short.

Thanks for that post it will come in handy.

EDIT

But matrices? {Matrix plural}
