
Geek Culture / dx10 isn't lookin' so good atm...

Author
Message
Xarshi
18
Years of Service
User Offline
Joined: 25th Dec 2005
Location: Ohio
Posted: 26th Aug 2007 21:10
So now we have reasons to fear DX10. Current cards (say, the 8800 GTS) barely run it at decent frame rates. They get like 30fps at most; in some cases the 8800 Ultra only averages 30fps. So we're looking at some bad times here for DX10. Here's a link to some benchmarks of the DX10 games that are out atm:
http://www.anandtech.com/video/showdoc.aspx?i=3029

Read that, and you'll regret buying that shiny new 8800 GTS (unless you bought it for outstanding performance on DX9).

Hello
GatorHex
19
Years of Service
User Offline
Joined: 5th Apr 2005
Location: Gunchester, UK
Posted: 26th Aug 2007 21:31 Edited at: 26th Aug 2007 21:32
That would explain some of the lowish FPS figures I saw in early FPSCX screenshots.

I'm sure the graphics cards will get better with DX10.1, and then we'll see current 8800s drop to half price.

DinoHunter (still no nVidia compo voucher!), CPU/GPU Benchmark, DarkFish Encryption DLL, War MMOG (WIP), 3D Model Viewer
gamebird
17
Years of Service
User Offline
Joined: 13th Jun 2007
Location:
Posted: 26th Aug 2007 21:44
I'm guessing the main reason these cards have trouble handling those games in DirectX 10 is that the developers were planning to release when the more powerful GeForce 9 series comes out.

I also read somewhere that DirectX 10.1 won't actually offer any additions important to developers.
Xarshi
18
Years of Service
User Offline
Joined: 25th Dec 2005
Location: Ohio
Posted: 26th Aug 2007 21:46
Well yeah, the GeForce 9 cards are supposed to be double the performance of their GeForce 8 counterparts. So I'm assuming they'll be fast. Like, freakishly fast. And they won't consume as much power, and won't heat up as much. Which I think is cool.

Hello
Digital Awakening
AGK Developer
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Sweden
Posted: 26th Aug 2007 22:30
Whenever there's a new DX standard out, it seems the latest cards can't keep up. This is to be expected: the new standard is meant to last for a while, and before DX11 arrives the cards will be lightning fast.

CREATE games with ease! NO programming required!
WIP
Aaron Miller
18
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 26th Aug 2007 23:39 Edited at: 26th Aug 2007 23:40
Well, as I understand it, the ATI R600, unoptimized, beat the GeForce 8800 GTX noticeably.

Here are some articles...
Article 1
Article 2

Cheers,

-naota

DBP, $80. DBP's plugins, $320. Watching DBP Crash, Priceless.
NG Website Aex.Uni forums
NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 27th Aug 2007 00:32
How many people played Oblivion with a DX9 card and said aloud "hmm, this would be a thousand times better with DX10"?

Few, I'm guessing.

Besides the slight restructuring, it is, like XP's Task Manager, difficult to improve upon. And so, just as Vista is roughly equal to XP in (useful) functionality, DX10 is roughly equal to DX9 in (useful) functionality.


Since the other one was scaring you guys so much...
bond1
19
Years of Service
User Offline
Joined: 27th Oct 2005
Location:
Posted: 27th Aug 2007 00:48
Yeah that's disappointing for sure. Maximum PC magazine tested dx10 with similar results, and even admitted that they had fallen victim to the hype of dx10.

So what's going on here? Dx10 is supposed to be so much more efficient and optimized than dx9. Is it because the new video cards are underpowered, or are programmers not taking advantage of what dx10 has to offer...

I still don't regret getting the 8800 though, knowing that it's probably the pinnacle of dx9 performance.

----------------------------------------
"Your mom goes to college."
Xarshi
18
Years of Service
User Offline
Joined: 25th Dec 2005
Location: Ohio
Posted: 27th Aug 2007 01:57 Edited at: 27th Aug 2007 01:59
@bond1 - Well, it's more that the current DX10 cards are severely underpowered. If NVIDIA is making their GeForce 9 cards 2x better than their counterparts, I'm sure that means they're basically making the default cards for DX10. Kind of like how the GeForce 7 cards are what people mainly think of for DX9, and the GeForce 6 cards aren't really thought of as good. So the GeForce 8 is like the GeForce 6 of DX10, in my opinion. Or... the code is un-optimized.

I do know that on the Irrlicht forum someone wrote a DX10 ray tracer (irrSpintz has DX10 functionality). It has about 10 spheres, each ray traced, plus a bigger ray-traced sphere. It's done using SM 4.0, and it gets about 255 fps on an 8800 GTX, at least Spintz said that. Here's the link to it on the irrSpintz forum. The demo was made by sio2 on that forum and on the Irrlicht forums. That guy is a freakin' genius with shaders.

http://irrlicht.spintz.com/smforums/index.php?topic=250.0

But with that in mind, that is no game right there. Games require many more active objects, such as AI, physics (sometimes), and other special effects besides ray tracing.

Now, don't get me wrong: if companies can use smart techniques in their games (like dynamic LOD on characters and every other object), then fps would increase. BUT DX10.1 will include tessellation, which will make it simple to create vast terrains, tessellate characters (to be super high poly up close and very low poly far away), and all sorts of stuff. So with that in mind, DX10.1 will be amazing. IF the GeForce 9 cards and the R700 cards can actually use DX10.1 to its fullest. Otherwise, it'll be a repeat of the GeForce 8: superior performance on last gen, low performance on current.
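To illustrate the dynamic-LOD idea above, here's a minimal sketch in plain C++, assuming a simple distance-threshold scheme. The mesh names, poly counts and thresholds are invented purely for illustration; this isn't code from FPSC or any engine mentioned in this thread.

#include <cstdio>

// One detail level of a mesh: its poly count and the camera distance
// up to which it should be used.
struct MeshLOD {
    const char* name;
    int         polyCount;
    float       maxDistance;
};

// Pick the first (most detailed) LOD whose distance threshold covers
// the current camera distance; fall back to the coarsest one.
const MeshLOD& pickLOD(const MeshLOD lods[], int count, float cameraDistance) {
    for (int i = 0; i < count; ++i) {
        if (cameraDistance < lods[i].maxDistance)
            return lods[i];
    }
    return lods[count - 1];
}

int main() {
    const MeshLOD characterLODs[] = {
        { "character_high", 12000,   10.0f },   // up close: full detail
        { "character_med",   4000,   40.0f },   // mid range
        { "character_low",    900, 1000.0f },   // far away: cheap version
    };
    const float distances[] = { 5.0f, 25.0f, 200.0f };
    for (float d : distances) {
        const MeshLOD& lod = pickLOD(characterLODs, 3, d);
        std::printf("camera at %5.1f -> draw %s (%d polys)\n",
                    d, lod.name, lod.polyCount);
    }
    return 0;
}

Hardware tessellation does the same kind of thing on the GPU side, refining geometry near the camera instead of swapping whole meshes.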

But I agree that the 8800 is the pinnacle of DX9 performance, and that is why I'm going to program with DX9 alone. That being said, I'll probably take up learning OGL or D3D9 (yes, I know OGL isn't DX9, haha, but the 8800s also have superior performance on OGL). Speaking of which, does OGL enable commercial-quality games to be made that run fast but look amazing?

Hello
GatorHex
19
Years of Service
User Offline
Joined: 5th Apr 2005
Location: Gunchester, UK
Posted: 27th Aug 2007 02:01
Quote: "Is it because the new video cards are underpowered, or are programmers not taking advantage of what dx10 has to offer..."


I expect it's a case of one certain GPU maker rushing their cards to market first to capture the initial demand for a DX10 product.

DinoHunter (still no nVidia compo voucher!), CPU/GPU Benchmark, DarkFish Encryption DLL, War MMOG (WIP), 3D Model Viewer
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 27th Aug 2007 10:54
And this whole topic is why non-developers should not discuss APIs.

DirectX 10 is one hell of a lot quicker than DirectX 9. What people seem to forget when developing between the two is that it isn't a 1:1 relationship, performance- and development-wise.

What most DX9-gen developers seem to do is just convert a DX9 pipeline to the DX10 calls, then pile on new effects and more polygons because of the quicker pipeline.
The end result is you get a better-looking version of the DX9 engine, but at a damn sight slower performance!

I mean, what is forgotten is that a card like, say, the 8800 handles 24 million polygons... no matter what API you're using, that is its technical maximum. You can't go beyond that no matter how few draw calls are made.

What's more, DX10-gen cards (particularly NVIDIA's) seem to pile on more video RAM. That has really confused me, given that DirectX 10 is oriented more towards streaming memory. So developers are still using the much slower separate-memory-management approach of loading everything up front.

Yes, it can handle more shaders... but if you want to see a performance increase, you can't just add more. Yet that is what developers are doing.

Hell, look at FPS Creator X9 and X10. Lee has shown the performance difference between the two concerning shaders, but the performance of the engine overall is halved because he's doing a crapload more.

I mean, in FPS Creator X9 with the 8800 GTX you get around 1,500fps with shaders activated. With X10 you get around 500fps. So from that standpoint you can say DX10 is slower!
But what you're forgetting is that the engine now has water, bloom, full dynamic lighting, soft particles, soft shadow mapping, etc., plus models with 5x more polygons!

So yeah, you're dropping to almost 33% of the speed, but you've gained so much graphical fluff. I mean, for god's sake, there is no longer any need for lightmapping at all... the lighting is all real-time via shaders.
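For what it's worth, those fps figures are easier to read as frame times; a quick back-of-the-envelope calculation in plain C++, using only the numbers quoted in this post:

#include <cstdio>

int main() {
    const double fpsX9  = 1500.0;   // X9 figure quoted above, shaders on
    const double fpsX10 =  500.0;   // X10 figure quoted above

    const double msPerFrameX9  = 1000.0 / fpsX9;    // ~0.67 ms
    const double msPerFrameX10 = 1000.0 / fpsX10;   //  2.00 ms

    std::printf("X9 : %.2f ms per frame\n", msPerFrameX9);
    std::printf("X10: %.2f ms per frame\n", msPerFrameX10);
    std::printf("all the extra effects cost about %.2f ms per frame\n",
                msPerFrameX10 - msPerFrameX9);
    std::printf("which is the %.0f%% of the old frame rate mentioned above\n",
                100.0 * fpsX10 / fpsX9);
    return 0;
}

In other words, at these very high frame rates a 3x fps drop is only about 1.3 ms of extra work per frame.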

That is a HUGE change not only in how the engine works, but also in the work it needs to do. While FPS Creator X10 isn't the best example of what is actually possible with DX10 itself, the fact is that compared to its X9 engine it shows an increase in visual quality, with a performance drop that still leaves it running at a reasonable speed on the lowest-performance cards that can run it.

For years it has pissed me off how DX9.0c developers seem to refuse to use the streaming memory system Microsoft added, as it greatly increases performance. You wouldn't believe how much performance is lost by relying on the processor for transfer speeds. It also means low-end systems always end up losing out even if their hardware can physically handle what is going on!

At the end of the day, PC (Windows) developers are just plain lazy. While APIs like DX10 are designed to help, it's not up to the API or graphics card to fix crappy engines.

NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 28th Aug 2007 23:32
Graphical fluff that I don't care about too much. I'd rather shaders were eliminated and high-triangle environments with high-res, non-repeating textures replaced them.


Since the other one was scaring you guys so much...
Digital Awakening
AGK Developer
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Sweden
Posted: 29th Aug 2007 00:45
Shaders are required for a realistic look and you can also use them to increase polygon count at close range. There are both vertex and pixel shaders.

CREATE games with ease! NO programming required!
WIP
Aaron Miller
18
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 29th Aug 2007 00:47
For example, the lovely Aero interface uses a pixel shader on a textured quad.

Shaders are indeed required for an impressive look; they were implemented as a result of people asking for something that could control the vertex rendering process at a low level, from what I understand.


Cheers,

-naota

DBP, $80. DBP's plugins, $320. Watching DBP Crash, Priceless.
NG Website Aex.Uni forums
GatorHex
19
Years of Service
User Offline
Joined: 5th Apr 2005
Location: Gunchester, UK
Posted: 29th Aug 2007 01:06 Edited at: 29th Aug 2007 03:34
I was looking at an old computer magazine today

2004 - nVidia 5950 Ultra and 9800XT cost £300

3 years later...

2007 - you can pick them up on eBay for around £30-40

Scary depreciation!

DinoHunter (still no nVidia compo voucher!), CPU/GPU Benchmark, DarkFish Encryption DLL, War MMOG (WIP), 3D Model Viewer
Digital Awakening
AGK Developer
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Sweden
Posted: 29th Aug 2007 03:10
In 3 years a lot has happened with 3D card technology; you can't even compare the upcoming cards with those from back then.

CREATE games with ease! NO programming required!
WIP
Aaron Miller
18
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 29th Aug 2007 03:14
Indeed, it's Moore's Law (I think).


Cheers,

-naota

DBP, $80. DBP's plugins, $320. Watching DBP Crash, Priceless.
NG Website Aex.Uni forums
Digital Awakening
AGK Developer
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Sweden
Posted: 29th Aug 2007 10:23
Moore's law states that transistor counts double roughly every 18 months, thus doubling the power of a CPU every 18 months (4 times in 3 years).

But there is a lot more going on, even in the CPU world, than transistors shrinking. A graphics card is like a small computer on a little board: it has its own special CPU, more and more of the latest in RAM technology, advanced cooling, its own "OS" (DirectX / OpenGL), its own motherboard with buses, etc.
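The "(4 times in 3 years)" figure is just the 18-month doubling compounded; a tiny sketch of the arithmetic, assuming the 18-month period stated above:

#include <cmath>
#include <cstdio>

int main() {
    const double doublingPeriodMonths = 18.0;        // as stated above
    const int monthsElapsed[] = { 18, 36, 54 };
    for (int m : monthsElapsed) {
        // scaling factor after m months = 2^(m / doubling period)
        double factor = std::pow(2.0, m / doublingPeriodMonths);
        std::printf("%2d months -> about %.0fx\n", m, factor);
    }
    return 0;
}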

CREATE games with ease! NO programming required!
WIP
NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 31st Aug 2007 20:08
This high-poly, no-shader scene looks far better than
http://uk.media.wii.ign.com/media/748/748547/img_4816347.html
this fairly low-poly, super-shaded scene
http://uk.media.xbox.ign.com/media/015/015922/img_1289391.html
to me.


Since the other one was scaring you guys so much...
Chris K
21
Years of Service
User Offline
Joined: 7th Oct 2003
Location: Lake Hylia
Posted: 31st Aug 2007 20:13
Was that a joke?

The Wii game has way more shaders than the Xbox game; I can't actually remember any shaders as such in the original Halo...

You should compare it to Gears of War; that's low poly + normal mapped.

-= Out here in the fields, I fight for my meals =-
Kentaree
22
Years of Service
User Offline
Joined: 5th Oct 2002
Location: Clonmel, Ireland
Posted: 31st Aug 2007 20:22
You're comparing Halo, which has been out for years, to Metroid Prime 3, which isn't even out yet...

NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 31st Aug 2007 22:32
Halo has:
Bump mapping (don't care)
Edge smoothing (don't care)
Real-time shadows (don't care much)
Ragdoll models (don't care because they glitch a lot)
A miles better physics engine

Metroid has:
Vertex lighting (looks alright)
No edge smoothing (looks alright)
Mostly a dark spot under bad guys (who cares?)
No ragdoll models - things explode (much more fun)
Pretty much no physics engine besides collision and gravity

Metroid still looks better; my point is that polygons look better than pixel shaders. If I had the choice, I would choose the former. It is much more artistic and immersive. Both scenes (due to effects) take about the same processor load. Every surface in the Halo screenshot is bump mapped with specular highlights. With the exception of grenade blasts in dark corners, can you really tell the difference? Not really, no.


Since the other one was scaring you guys so much...
Chris K
21
Years of Service
User Offline
Joined: 7th Oct 2003
Location: Lake Hylia
Posted: 1st Sep 2007 12:23
Quote: "Metroid still looks better; my point is that polygons look better than pixel shaders."


What? No! Metroid has way more shaders running!!

Clearly it has Bloom lighting going on there (the developers made a big point of this because they said it was something they couldn't have done on GC), and also - look at the floor! It's blatantly normal mapped! You see all those bumps in a honeycomb shape... did you think that they were actually modelled with polygons?

It's hard to tell with JPEGs, but it looks kind of like they both have anti-aliasing on; I think that's what you mean by edge smoothing.

Quote: "Both scenes (due to effects) take about the same processor load."


Go and watch the original developer walkthrough of Unreal Engine 3; you'll see them switch between high poly and low poly + normal maps... you can't tell the difference, except that the one with shaders runs about 100x faster.

-= Out here in the fields, I fight for my meals =-
Chris K
21
Years of Service
User Offline
Joined: 7th Oct 2003
Location: Lake Hylia
Posted: 1st Sep 2007 12:25
Here's an example...



It would be much slower to render a model that looks like the one on the right, just by using thousands of polygons.

-= Out here in the fields, I fight for my meals =-
bond1
19
Years of Service
User Offline
Joined: 27th Oct 2005
Location:
Posted: 1st Sep 2007 16:18 Edited at: 1st Sep 2007 16:24
Quote: "What most Dx9-Gen developers seem to do is just convert a Dx9 pipeline to the Dx10 calls, and pile on new effects and more polygons because of a quicker pipeline.
The end result is you get a better looking version of the Dx9 engine but at quite a damn sight slower performance!

I mean in FPSCreator X9 with the 8800GTX you get around 1,500fps with shaders activated. With X10 you get around 500fps. So from that standpoint you can say DX10 is slower!
"


But this isn't what we're getting at. And I'm not talking about FPSC X10, which has OBVIOUSLY more graphical goodness going on than X9.

I'm talking about retail games that have been tested back to back on DX9 and DX10, with negligible graphical differences between the two. There aren't any effects being "piled on" for the DX10 versions of the games from what I've seen; hell, most of the "enhancements" (so far) seem to be soft particles or slightly softer shadows, hardly even noticeable. I can't remember the exact game they tested in Maximum PC, but we're talking about a 30% or more performance decrease in DX10 - why?

It would be a different story if DX10 were using higher-poly models or more shaders for these games, but that's just not the case, and it doesn't explain at all why the games are running so much slower in DX10.

----------------------------------------
"Your mom goes to college."
Grandma
18
Years of Service
User Offline
Joined: 26th Dec 2005
Location: Norway, Guiding the New World Order
Posted: 1st Sep 2007 16:23
Quote: "Halo has:
[...]
Ragdoll models"


I can't remember Halo 1 having ragdoll physics; Halo 2 got it, but surely not Halo 1. At least not on the Xbox; I haven't seen the PC version in action.

This message was brought to you by Grandma industries.

Making yesterdays games, today!
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 1st Sep 2007 16:45


That isn't using shaders?
As it is quite clear to make out:

Normal Mapping, Bloom, Reflection, Metallic Shaders going on.
Quite frankly there is no point in having the power of an X1500 PRO and not using the shaders for the game, as the shader pipeline is far more powerful than the fixed-function one, or than falling back on the 750MHz PPC CPU.

Quite frankly, you have no clue either about what goes into making games of this quality or about what technology is being used in the games you're so fond of... or perhaps you're just blinded by your desire to believe that shaders are pointless.

In which case you really should be linking a PlayStation 2 game, as while effects like bloom and bump mapping were possible in the same scene, they were done using the twin fixed-function pipelines. It was never possible to achieve real-time lighting anywhere close to as nice as what you can do with shaders.

Here are some better "comparison" examples

PS2 - No Shaders


GC - Shader 1.4


PS2 - No Shaders


GC - Shader 1.4


Both versions of Resident Evil 4 use identical media... so same polycount for both games, but you can see a very big difference even with these crappy screenshots.

I wish I had the PS2 version, as I'd be able to capture screens for a more direct comparison; you just can't see it from those screenshots, as they're quite low quality for some reason.

Chris K
21
Years of Service
User Offline
Joined: 7th Oct 2003
Location: Lake Hylia
Posted: 1st Sep 2007 16:56
The GC could do some really nice shader effects; the water in RE4 there is a lot better than in the PS2 version, which doesn't seem to distort at all. The lighting in parts of RE4 is just amazing too...

The heat haze effect in Wind Waker is really nice as well, probably the coolest shader effect on GC.

I think the PS2 models were lower poly though; in fact I'm pretty much sure of it... and the sounds were lower quality too. Also, the cutscenes weren't in-engine; they were FMV. Basically, if you've only played the PS2 version of RE4, you haven't played RE4.

-= Out here in the fields, I fight for my meals =-
NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 2nd Sep 2007 23:38 Edited at: 2nd Sep 2007 23:39
Although there is shader usage in your screenshot of MP3, Raven, there is none in mine. It is an early screenshot from before bloom, etc. was added. The "metallic shaders" you see are merely an alpha-blended copy of the model beneath, with a different "shiny" texture and distorted UV maps. By the way, where on Earth is the reflection in that?


Since the other one was scaring you guys so much...
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 2nd Sep 2007 23:45
Quote: "Although there is shader usage in your screenshot of MP3, Raven, there is none in mine."


I'd suggest you focus on the weapon in your so-called earlier screenshot without shaders.
And if you can't see the reflection in the shot I posted, then perhaps you should refrain from arguing about shaders versus high-poly, no-shader pipelines.

Especially given that Metroid Corruption is very far from "high-poly".

NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 2nd Sep 2007 23:50 Edited at: 3rd Sep 2007 00:01
It just looks alpha-blended to me. I can't see any reflections, although if there are any, I apologise. By reflection, you mean a reflection of the surrounding scene, yes? Or do you mean a pre-drawn image that approximates the surroundings? (I've seen this in Lego Star Wars, Elite Force 2, and Metroid Prime 1-2.)

OK, maybe the screenshots I posted were bad examples.
Yes, I realize I have switched to Metroid Prime 1. It is more representative of what I am talking about than MP3, which I have yet to play or see except in the form of blurred screenshots.

Pretty high poly, no shaders. (Vertex fog and lighting. If you don't believe me, play it all the way through and watch the amusing lighting glitches it can cause in the Morph Ball)
Yes, every single bar on that door is picked out in polygons. It looks impressive when it opens.
http://uk.media.cube.ign.com/media/015/015316/img_1559236.html

Bump-mapped floor (although kinda hard to see in this screenshot), bump-mapped walls, real-time shadowing on the Master Chief, anti-aliasing, and horrendously low poly.
http://uk.media.xbox.ign.com/media/015/015922/img_1272881.html

There.
Now decide which looks better.


Since the other one was scaring you guys so much...
Chris K
21
Years of Service
User Offline
Joined: 7th Oct 2003
Location: Lake Hylia
Posted: 3rd Sep 2007 01:56 Edited at: 3rd Sep 2007 01:56
This is ridiculous; there's no way you can compare any two random screenshots. We could happily find a stunning screenshot of Halo and a bare one of Metroid Prime...

I mean, hell, you had to scroll through to 65 of 103 for the Halo one, and 111 of 262 for the Metroid one.

My objection was merely that you said there were no shaders running in the screenshot you posted, when the floor is clearly normal mapped and not made out of polygons...



Your argument only really applies to depth-mapping effects as well; I mean, there is no way to recreate any distortion or shadowing effects with polygons, is there?



Your main mistake, though, is thinking that it's roughly as intensive to display a high-poly model as to display a lower-poly model with a normal map made from the high-poly one. The normal-mapping way is much, much faster even though it looks the same.
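A minimal sketch of why that works, written as plain CPU-side C++ rather than HLSL. The texel values and light direction are made up for illustration; the point is that the per-pixel lighting detail comes from the baked normal, not from extra polygons.

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// A normal-map texel stores a unit normal packed into RGB bytes.
static Vec3 unpackNormal(unsigned char r, unsigned char g, unsigned char b) {
    return normalize({ r / 255.0f * 2.0f - 1.0f,
                       g / 255.0f * 2.0f - 1.0f,
                       b / 255.0f * 2.0f - 1.0f });
}

int main() {
    Vec3 lightDir = normalize({ 0.3f, 0.8f, 0.5f });

    // A perfectly flat texel (what a surface with no normal map gives you)
    // versus a "bumpy" texel baked from high-poly detail.
    Vec3 flat  = unpackNormal(128, 128, 255);
    Vec3 bumpy = unpackNormal(180, 110, 220);

    float flatLight  = std::fmax(0.0f, dot(flat,  lightDir));   // Lambert term
    float bumpyLight = std::fmax(0.0f, dot(bumpy, lightDir));

    std::printf("flat texel : diffuse %.2f\n", flatLight);
    std::printf("bumpy texel: diffuse %.2f\n", bumpyLight);
    // The geometry stays low poly; the per-pixel lighting variation is
    // what makes the surface read as the high-poly original.
    return 0;
}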

-= Out here in the fields, I fight for my meals =-

Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 3rd Sep 2007 03:07
Realistically, your argument about which looks better between Halo and Metroid fails, given that to me Halo actually looks better... but the argument itself stands up, because Halo on the whole uses fewer shaders than Metroid on the GC/Wii does.

What you probably can't notice (and certainly can't from those piss-poor screenshots... seriously, what is it about GC screenshots that makes the output seem much worse than it looks on the TV?!) is that, while yes, Metroid has a higher polycount for its level (I mean, seriously, Bungie could've jazzed Halo up a bit with the occasional curved surface; at least they learnt their lesson for Halo 2... sort of), the fact is there are still a number of key shaders used to get Metroid looking the way it does.

As I said in one of my first replies, if you want to have the whole shaders vs. fixed-pipeline argument, look at some PlayStation 2 games.
The PlayStation 2 is incapable of shaders of any form, so enhanced graphics require writing the graphics pipeline by hand, usually cutting the potential polygon count in half to create a snazzy effect (nb: MGS3)... that said, some PlayStation 2 games do look stunning regardless, without relying heavily on these effects.
A great example is Killzone.

That said, for modern graphics, if you can show me a traditional fixed-function pipeline capable of Gears of War graphics quality, then I might agree that shaders aren't really what they're cracked up to be; however, I highly doubt you will be able to.

Vertex lighting just doesn't cut it next to pixel lighting, and while modern graphics cards can push up to 20 million polygons per scene (Radeon HD 2900 XT), there is no way you can possibly have a world consisting of 5 million polygons with characters consisting of another 2 million polygons each (some scenes in Gears of War have up to 16 on screen at once), plus expect the video RAM to handle that along with the textures that would have to be layered (for each pass of a texture you can take away some of the potential polygons you can have).

All of that is assuming you have nothing else going on, like AI, gameplay or physics.
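The arithmetic behind that point, using only the figures quoted in this post (illustrative numbers, not measurements):

#include <cstdio>

int main() {
    const long long perSceneBudget     = 20000000LL;  // "up to 20 million polygons per scene"
    const long long worldPolys         =  5000000LL;  // hypothetical 5M-poly world
    const long long polysPerCharacter  =  2000000LL;  // hypothetical 2M-poly characters
    const long long charactersOnScreen = 16;          // "up to 16 on screen at once"

    const long long needed = worldPolys + charactersOnScreen * polysPerCharacter;

    std::printf("needed : %lld polygons per scene\n", needed);
    std::printf("budget : %lld polygons per scene\n", perSceneBudget);
    std::printf("that's %.1fx the budget, before textures, AI or physics\n",
                (double)needed / (double)perSceneBudget);
    return 0;
}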
