
Geek Culture / [LOCKED] GeForce 6800 (nv40)

Author
Message
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Apr 2004 00:38
The new site went up around 2 minutes ago.
http://www.nvidia.com/page/home.html

A lot of new videos; feel free to check them out.

Some interesting stuff for those who wanted to know.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
The Game
Joined: 22nd Dec 2002
Location: United States
Posted: 15th Apr 2004 00:54
TechTV has the review for it up.

http://www.techtv.com/news/products/story/0,24195,3668181,00.html

I am the game and I want to play.
las6
Joined: 2nd Sep 2002
Location: Finland
Posted: 15th Apr 2004 11:45
Far Cry still looks as ugly as ever.


| Keyboard not detected. Press F1 to continue. |
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Apr 2004 19:15
Now that's just stupid o_O ...
The Radeon doesn't support Shader 3.0 or 3.x.
The GeForce FX cards are supposed to support 3.0 (as specified in DirectX 9.0), but for some reason the drivers lock you to 2.x.

Sim & Chris (the lead designers) have both recently explained why support for certain things is present in the hardware yet the current drivers do not expose it.
I'm still pretty worried about the blend banding; it doesn't affect the cards as badly with the 56.xx drivers, but there is still a noticeable difference between the GeForce4 and GeForce FX processors in this respect.

Far Cry doesn't actually look too impressive to me, dunno why. It's like Gun Metal really; they've used the current features to show off a nice game and all, but when you pay closer attention the graphics are seriously let down and the gameplay totally sucks.
It's why I love Doom 3's graphics so much, even over HL2; although HL2 at times can look better, Doom 3's graphics have a consistent quality to them which doesn't appear to falter.

Sorry, but I've not been impressed with the Radeons since release; their speed is only marginally better under DirectX, and the FX5700 has recently shown that NVIDIA processors in the hands of IBM are far more powerful at a fraction of the cost. As the NV40 is also being developed by IBM, you can bet that the GeForce 6-Series is not going to be as full of holes as the FX-Series has been.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Apr 2004 19:47
Quote: "Far cry looks best on ati's graphics cards

This maybe because of the new nvidia drivers for their nv40.. hopefully when nvidia sorts this 'slight' problem "


That is a definite dig at the FX, especially as even the developers stated for nZone that it performed and looked better on the FX5950 than on any other card.

From what I've gathered about the R400, it *will* have Shader 3.0 compatibility... but just like the Radeon 9-Series it will only support the standard version, not an extended version.
(Which is why Far Cry performs better on the FX-Series.)

but it still looks like a dog to me.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Apr 2004 20:29
The shader quality looked horrible on the preview version I got my hands on; god knows what a lower quality setting looks like.
That said, on max settings at 1600x1200 my FX5950 Ultra was still pumping out 60fps... at that kind of speed at that resolution I really couldn't care less if the Radeon performed better.

That said, the latest drivers for the GeForce FX series change the story entirely speed-wise.
Example: 3DMark '03... same system, just updated drivers:
FX5200: 44.04 = 859, 53.23 = 1015, 56.70 = 1420
FX5950 Ultra: 53.23 = 8,320, 56.70 = 9,870

That is on bare minimum spec for everything else in the tests; the difference is just bloody remarkable.
Whatever they've fed their 56.xx ForceWare, it's pushing some monster performance now.
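To put those driver gains in perspective, here's a quick back-of-the-envelope in Python; the scores are taken straight from the post above, nothing else is assumed:

```python
# 3DMark03 scores from the post above (driver version -> score, same system)
fx5200 = {"44.04": 859, "53.23": 1015, "56.70": 1420}
fx5950_ultra = {"53.23": 8320, "56.70": 9870}

def pct_gain(old, new):
    """Percentage improvement going from the old score to the new one."""
    return round(100.0 * (new - old) / old, 1)

print(pct_gain(fx5200["44.04"], fx5200["56.70"]))              # 65.3 (% faster)
print(pct_gain(fx5950_ultra["53.23"], fx5950_ultra["56.70"]))  # 18.6 (% faster)
```

So the FX5200 picked up roughly two-thirds more 3DMark score from drivers alone, which is indeed remarkable for a software-only change.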


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Gery
Joined: 24th Jan 2004
Location:
Posted: 15th Apr 2004 21:59
One of my friends has an FX5200, and in Far Cry, when he plays at 800x600 on low quality, the game runs at 30 fps.
When he switches to 1024, the game runs at only 22!
That is a horrible slowdown!

He has an AMD 2200, an FX5200 and 384MB DDR.
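For what it's worth, that drop is roughly what you'd expect from fill rate alone; a quick check in Python (assuming "1024" means 1024x768):

```python
# Frame rates reported above on an FX5200, at two resolutions.
low_res, low_fps = (800, 600), 30
high_res, high_fps = (1024, 768), 22   # assuming "1024" means 1024x768

pixel_ratio = (high_res[0] * high_res[1]) / (low_res[0] * low_res[1])
fps_ratio = low_fps / high_fps

print(round(pixel_ratio, 2))  # 1.64 -> 64% more pixels to fill
print(round(fps_ratio, 2))    # 1.36 -> 36% slower
```

The frame rate actually drops a bit less than the pixel count grows, so the card may be partly CPU-limited at the lower resolution rather than anything being broken.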

In nineteen-sixty / the devil shat in the cauldron / whoever speaks up first / gets the cauldron of shit.
Gery
Joined: 24th Jan 2004
Location:
Posted: 15th Apr 2004 22:05
WOW, WHAT A BEAST!

It needs two non-shared power connectors and a 480-watt PSU!!!
And NVIDIA hasn't said how fast the GPU is.
I think it's around 500-800 MHz...

In nineteen-sixty / the devil shat in the cauldron / whoever speaks up first / gets the cauldron of shit.
Neophyte
Joined: 23rd Feb 2003
Location: United States
Posted: 15th Apr 2004 23:28
Here's something from Tom's Hardware that seems pretty informative.
http://www20.tomshardware.com/graphic/20040414/index.html

The Geforce 6800 is shaping up to be quite the impressive card. It seems to be kicking a$$ and taking names. I'd seriously suggest that people check out those scores. The Geforce 6 was running at two resolutions higher than its competitor's best and it was still beating them.

@Raven

I know that I'm breaking my little self imposed rule about not arguing with you unless you argue with me, but I'm in a bad mood.

"GeforceFX are suppose to support 3.0 (as specified in DirectX 9.0), but for some reason the drivers lock you to 2.x"

Or maybe it's because it isn't in the graphics card.
I thought I linked you to nvidia's site where they listed all of the OGL extensions for their cards and NONE of them had extensions for 3.0? Or how about all of the hardware review sites out there that state that the FX cards at most support VS and PS 2_x?

Will you ever learn? Why must you studiously ignore everything that contradicts your world view? Grrrrrrrr. *tears hair out*
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 16th Apr 2004 01:00
Quote: "The Geforce 6 was running at two resolutions higher than its competitor's best and it was still beating them."


That's because the competitors haven't released their latest line yet... it's a step ahead.

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 16th Apr 2004 04:39
Quote: "Or maybe its because it isn't in the graphics card.
I thought I linked you to nvidia's site where they listed all of the OGL extensions for their cards and NONE of them had extensions for 3.0? Or how about all of the hardware review sites out there that state that the FX cards at most support VS and PS 2_x?"


I see a pretty major flaw in your reasoning here.
Shader Models 1, 2, 3 & 4 are developed by Microsoft for DirectX.
How exactly would OpenGL extensions have anything to do with them?
Add to this, OpenGL only supports FP 1.0 and 2.0... no variations.
So why would there be any NVIDIA extensions outside of NV30/40?
Those are functions that are graphics-card specific, not Shader Model specific.

Add to this, you should read the ENTIRE paragraph I wrote.
The designer stated that the cards are technically Shader 3.0 capable; DirectX 9, however, is not. Add to this, the drivers have the features LOCKED. It's not that they're not there; it is that they are locked. Just like nvTweak is locked, and up until recently GLSL OpenGL 1.5.0 support was locked, along with a whole host of other things which are in the drivers but are not currently available for use.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Neophyte
Joined: 23rd Feb 2003
Location: United States
Posted: 16th Apr 2004 17:05
@Raven

"Shader Model 1, 2, 3 & 4 are Developed by Microsoft for DirectX"

Bull. Friggin. Sh*t. The Shader models are developed by independent companies sometimes with the collaboration of Microsoft sometimes without(PS 1.4 is the sole creation of ATI for example).

"How exactly would OpenGL Extensions have anything to do with them?"

Because it would be accessible through the OpenGL extension mechanisms. Would you like me to link you to their page again?
http://developer.nvidia.com/object/nvidia_opengl_specs.html
Every time a new feature for a card is released they come out with extensions to support it. Just because it's not in the DX spec doesn't mean that it won't be accessible.
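To make the mechanism concrete, here's a minimal sketch in Python of what checking for a vendor extension amounts to. The extension string below is a hand-written sample for illustration, not real driver output; in a real program it would come from glGetString(GL_EXTENSIONS):

```python
# Hand-written sample extension string; a real one comes from
# glGetString(GL_EXTENSIONS) and is a single space-separated list.
gl_extensions = (
    "GL_ARB_vertex_program GL_ARB_fragment_program "
    "GL_NV_vertex_program2 GL_NV_fragment_program "
    "GL_ATI_fragment_shader"
)

def supports(ext_string, name):
    """True if the space-separated extension string advertises `name`."""
    return name in ext_string.split()

# Vendor-specific shader extensions show up alongside the ARB ones:
print(supports(gl_extensions, "GL_ATI_fragment_shader"))  # True
print(supports(gl_extensions, "GL_EXT_made_up_thing"))    # False
```

This is the whole point of the extension mechanism: vendor shader features don't need to be in the core GL spec to be usable.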

"Add to this OpenGL only supports FP 1.0 and 2.0 ... no varations."

*cough*ATI_fragment_shader*cough*

"So why would there be any NVIDIA extentions outside of NV30/40?"

Huh? Where did I say that?

"There are functions graphics card specific, not Shader Model specific."

Quote: "NV_vertex_program
NV_vertex_program1_1
NV_vertex_program2"


Quote: "NV_texture_shader
NV_texture_shader2
NV_texture_shader3"


Source: The link above. Read it this time.

"The designer stated that the cards are technically Shader 3.0 capable, DirectX9 however is not"

DX9 is Shader 3.0 capable. The spec for 3.0 is listed in the DX9 sdk. And if you still don't believe me take a look at this joint presentation by Nvidia and ATI where they talk about VS and PS 3.0. And the title of this presentation? Introduction to DirectX 9 Shader Models.
http://www.atitech.ca/developer/gdc/D3DTutorial1_Shaders.pdf

"Add to this the drivers have the features LOCKED not that they're not there, it is that they are locked. Just like nvTweak is locked, up until recently GLSL OpenGL 1.5.0 support was locked and a whole host of other things which are in the drivers but are not currently available for use."

More baseless, stupid claims. Why don't you provide a link to this fabled discussion then and settle this once and for all? I'm not holding my breath though. You have this horrible habit of making stuff up when it suits you, just like you are doing now.

Oh, and Nvidia have a habit of posting all of their presentations up on their website so I'm sure you'll have nooooo trouble whatsoever disproving me.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 16th Apr 2004 21:01
Okay, I'm going to put this bluntly: you are a frikkin' moron!

two links for you:
OpenGL Shader Specifications

DirectX HLSL Specifications

Now it baffles me as to why OpenGL ONLY seems to cover ARB, ARB2 and VP, which are quite obviously different from the DirectX Shader 1.0/2.0/3.0 specifications... it also baffles me as to why ARB only seems to cover what is available within Pixel 1.0, and not 1.1/1.2/1.3/1.4.
Oh yeah, but that's right, I'm forgetting: 1.3 is NVIDIA-only and 1.4 is ATI-only.

Which you appear to believe are part of the extensions nv_nv3x and ati_fragment_program... YET if we take a look at Cg, it appears to support the Pixel 1.3 AND the Pixel NV30 formats!
What would be the point in adding BOTH formats if they were identical!?

I would say if you want the full story on Shader 3.0, follow the breadcrumbs on NVIDIA's updated site and stop wasting my time.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
CattleRustler
Retired Moderator
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 16th Apr 2004 21:44
wow, never dull around here, eh. Especially when having a VS conversation about video cards.

carry on mates


Home of the VB.NET Class Builder Utility - Demo and FULL versions now available!!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 17th Apr 2004 01:17
Raven once again proves he does not know how video cards, and PS models specifically, work.

Emperor Baal
Joined: 1st Dec 2003
Location: The Netherlands - Oudenbosch
Posted: 17th Apr 2004 01:43 Edited at: 17th Apr 2004 01:49
Take a look at the new Unreal (3) engine that's using the GeForce 6800 Ultra to render the objects:

movie (~11MB, needs the latest codecs):

http://www.jamesbambury.pwp.blueyonder.co.uk/unreal3_0002.wmv


When the card is released it will run with a 300W+ PSU (believe me, it's true). The latest Radeon 9800XT uses more power than the 6800 Ultra.

The card also supports "virtual displacement mapping": when it's used on a plane textured with a wall, it will look 3D in-game (like stones on top of each other).

Quote: "
UPDATED

Amd 2800+ 1024mb pc3200 A7N8X - Deluxe Ati Radeon 9800PRO 256mb
"
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 17th Apr 2004 07:24
Just for Mouse's benefit then...

Quote: "
Support for 14 profiles:
- vs_1_1 for DirectX 8 and DirectX 9
- vs_2_0 and vs_2_x for DirectX 9
- ps_1_1, ps_1_2 and ps_1_3 for DirectX 8 and DirectX 9
- ps_2_0 and ps_2_x for DirectX 9
- arbvp1 [OpenGL ARB_vertex_program]
- arbfp1 [OpenGL ARB_fragment_program]
- vp20, vp30 [NV_Vertex_program 1.0 and NV_Vertex_program 2.0]
- fp30 [NV30 OpenGL fragment programs]
- fp20 [NV_register_combiners and NV_Texture_shader)
"
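For anyone counting, the 14 profiles in that quote split cleanly between the two APIs; a quick tally in Python (the profile names come from the quote above, the groupings are just my reading of it):

```python
# The 14 Cg profiles from the quoted list, grouped by the API they target.
cg_profiles = {
    "DirectX": ["vs_1_1", "vs_2_0", "vs_2_x",
                "ps_1_1", "ps_1_2", "ps_1_3", "ps_2_0", "ps_2_x"],
    "OpenGL ARB": ["arbvp1", "arbfp1"],
    "OpenGL NV-specific": ["vp20", "vp30", "fp30", "fp20"],
}

total = sum(len(names) for names in cg_profiles.values())
print(total)  # 14, matching "Support for 14 profiles"
```

Note the OpenGL side consists only of the generic ARB profiles plus NVIDIA's own; there is no ps_1_4 and no ATI-specific profile anywhere in the list.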


You can make of that what you will.
The stupidity of some people is just dumbfounding really.

That all said, the video looks awesome... wish I could've been at the launch.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Neophyte
Joined: 23rd Feb 2003
Location: United States
Posted: 17th Apr 2004 09:39 Edited at: 17th Apr 2004 09:42
@Raven

"Oki i'm going to put this bluntly, you are a frikkin' moron!"

Oh, this is going to be fun.

"two links for you:
OpenGL Shader Specifications

DirectX HLSL Specifications"

Errr...you know that none of those support your case whatsoever, right?

"Now it baffles me to why OpenGL ONLY seems to cover ARB, ARB2 and VP which are quite obviously different DirectX Shader 1.0/2.0/3.0 Specifications...it also baffles me to why ARB only seems to cover what is available within Pixel 1.0 and not 1.1/1.2/1.3/1.4 "

Wait. First you say that ARB, ARB2, and VP are obviously different from the DX Shader models. Then you say that ARB is just like PS 1.0, which is a DX Shader model. So which is it? Are they the same or are they different? Make up your mind.

"it also baffles me to why ARB only seems to cover what is available within Pixel 1.0 and not 1.1/1.2/1.3/1.4"

The various models are covered through vendor-specific OGL extensions. Maybe *cough*ATI_fragment_shader*cough* was a little too subtle for you. I'll try again. ATI_fragment_shader is an OpenGL extension that exposes ATI's shader capabilities on the Radeon 8500 and up (presumably). That means you can run PS 1.4 shaders through it.
Quote: "Pixel Shading operations of the Radeon 8500 are exposed via the ATI_fragment_shader extension."

Source: Page 3
http://www.atitech.ca/developer/ATI_EGDC_AdvancedShaders.pdf

Also, how do you explain this if OpenGL extensions only cover FP 1.0 and FP 2.0?
Quote: "This paper focuses will focus on Vertex Shader version 1.1 and Pixel Shader version 1.4"

Quote: "OpenGL functionality for both Vertex and Pixel shaders is provided by hardware vendors through OpenGL extensions and further details can be found on the relevant company's webpage."

Source: Page 3 of "Programmability Features of Graphics Hardware"
http://www.ati.com/developer/SIGGRAPH02/GHProgrammability-notes.pdf
Here is a paper that focuses on VS 1.1 and PS 1.4 and says that the functionality for them and other shaders is exposed through vendor specific extensions.

"Oh yeah but that's right i'm forgetting 1.3 is NVIDIA only and 1.4 is ATI only."

I've never said that 1.3 is NVIDIA-only. That sounds like some harebrained idea you'd cook up. If you think I did, then quote me where I said it. I don't think you'll succeed though. After all, I just posted a tutorial that listed many ATI cards as being 1.3 capable.

As for 1.4 being ATI-only, I'm not entirely sure about that. I know for a fact that 1.4 is not supported on the GeForce4 Ti 4200 which I have. I also know that Cg doesn't support the PS 1.4 profile at this time. This leads me to conclude that either PS 1.4 doesn't run on Nvidia hardware, or they are deliberately making sure that ATI's format doesn't get used a lot. If 1.4 isn't in the old GeForce4s I don't see a whole lot of point in adding it to the new hardware, as the much superior 2.0 is implemented and developers will just use that instead. But this is all speculation. Until I can get an outside source to confirm whether Nvidia hardware supports PS 1.4 or not, I'm going to have to side with 'not' for now.

"Which you appear to believe are part of the extensions nv_nv3x and ati_fragment_program... YET if we take a look at Cg, it appears to support Pixel 1.3 AND Pixel nv30 formats!
What would the point be in adding BOTH formats if they are identical!?"

This is just too funny. PS 1.3 and the nv30 format aren't identical. Nv30 is the codename for Nvidia's GeforceFX series. That means the Nv30 format is PS 2.0.

"I would say you want the full story on Shader 3.0 you follow the bread crumbs on NVIDIA's updated site and stop wasting my time."

Nice try. But everyone knows you are full of sh*t. The presentation page for nvidia's site is here:
http://developer.nvidia.com/object/presentations.html
Notice how none of those mention anything whatsoever about PS and VS 3.0 being hidden in the GeforceFX series? Thought so.

Also, just to show you up a little more:
Quote: "NVIDIA CineFX 3.0 is poised to unleash a new level of programming creativity. With full DirectX 9.0 Shader Model 3.0 support, the newest GeForce GPUs will soon power a new generation of games with unmatched realism, digital worlds with mind-blowing complexity, and lifelike characters that move through cinematic-quality environments."

Source: Advanced Visual Effects
http://www.nvidia.com/object/feature_cinefx3.0.html
Guess that blows your whole DX9-isn't-capable-of-Shader-Model-3.0 theory out of the water, huh?

Oh, and to rub it in a little more, if OpenGL isn't capable of anything more than Arbfp1 and fp20 then how come Nvidia says this in their press release?
Quote: "Through the power of the Microsoft® DirectX® 9.0 Shader Model 3.0 and OpenGL® 1.5 APIs, programmers can now develop shader programs utilizing these technologies and techniques:"

Source: Advanced Technologies
http://www.nvidia.com/object/feature_cinefx3.0.html

Guess that makes you feel like a bit of a prat for saying "Add to this OpenGL only supports FP 1.0 and 2.0 ... no varations.", eh?

"Just for Mouses' benifit then...


Quote: "
Support for 14 profiles:
- vs_1_1 for DirectX 8 and DirectX 9
- vs_2_0 and vs_2_x for DirectX 9
- ps_1_1, ps_1_2 and ps_1_3 for DirectX 8 and DirectX 9
- ps_2_0 and ps_2_x for DirectX 9
- arbvp1 [OpenGL ARB_vertex_program]
- arbfp1 [OpenGL ARB_fragment_program]
- vp20, vp30 [NV_Vertex_program 1.0 and NV_Vertex_program 2.0]
- fp30 [NV30 OpenGL fragment programs]
- fp20 [NV_register_combiners and NV_Texture_shader)
""

If it's for his benefit you might as well tell him your source:
http://developer.nvidia.com/object/cg_toolkit.html

I'd also like to point out the conspicuous absence of ps_1_4 here. And how, when it comes to OGL extensions, only the very old and basic ARB_*_program and Nvidia-only extensions are supported. No ATI or other vendor extensions are mentioned. Hmmmm. Could it be that Nvidia is deliberately ignoring other platforms which hold certain shader models that aren't implemented in their hardware?

"the stupidity of some people is just dumbfounding really."

I couldn't have said it better myself.

But to sum up:

It's been over a year since I started arguing with you about shaders and I can see almost nothing has changed. You still keep touting some titbit of gossip or rumor you heard off the net as fact, and making stuff up as you go along as well. I'm still discrediting every one of your sorry arguments and claims over and over and over again.

The one positive note of change is that this time you actually put a little effort into your post and provided some links. Granted, none of them actually helped your case whatsoever, but at least it is different from your usual make-an-outrageous-claim, ignore-evidence-to-the-contrary, make-another-outrageous-claim cycle. But all of this has lightened my mood a bit and I'm feeling optimistic. Who knows, maybe in another year's time you might actually post a link that's relevant to your argument.

@CattleRustler

"wow, never dull around here, eh. Especially when having a VS conversation about video cards."

This is nothing. You should have seen our arguments a year ago about shaders. There would be at least 50 posts added to a thread because of just the two of us arguing. Each of them several paragraphs long.

Here is a link to my very first argument with Raven:
http://darkbasicpro.thegamecreators.com/?m=forum_view&t=8191&b=1

And here is the second argument:
http://darkbasicpro.thegamecreators.com/?m=forum_view&t=10482&b=1

Both of them are about shaders and both are long reads.

*Edited to fix quote tag*
Rob K
Retired Moderator
Joined: 10th Sep 2002
Location: Surrey, United Kingdom
Posted: 17th Apr 2004 14:02
Quote: "the stupidity of some people is just dumbfounding really."


How he can write that and keep a straight face I just don't know.

BlueGUI:Windows UI Plugin - All the power of the windows interface in your DBPro games. - Plus URL download, win dialogs.
Over 140 new commands
Lord Ozzum
Joined: 29th Oct 2003
Location: Beyond the Realms of Death
Posted: 17th Apr 2004 17:02
FATALITY!

Once I dreamt that I fell into a lake full of the undead and demons. I screamed and hollered as my kitten jumped into it with me. None of my friends helped me.

I don't trust them anymore.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 17th Apr 2004 19:50
This is just bloody stupid.

Quote: "PS 1.3 and the nv30 format aren't identical. Nv30 is the codename for Nvidia's GeforceFX series. That means the Nv30 format is PS 2.0."


Quote: "Here is a paper that focuses on VS 1.1 and PS 1.4 and says that the functionality for them and other shaders is exposed through vendor specific extensions."


OpenGL DOES NOT SUPPORT these Shader Specifications.
THE GRAPHICS CARD DRIVERS DO!

Why is this a major factor? Because, put simply:
ati_fragment_program ... includes PS 1.1/1.4/2.0
nv30_ ... includes PS 1.0/1.1/1.2/1.3/2.0/2.x

Now why is this interesting?
Because under DirectX the NVIDIA FX-Series and the Radeon 9-Series BOTH support the other's specific shader versions!
What is even more interesting is that DirectX 8.0 (8.1 for 2.x) supported these features way back in 2000, whereas it wasn't until OpenGL 1.4 that anything other than PS 1.3 & 1.4 was supported on these cards.

Not ALL ATI cards support 1.3, and not ALL NVIDIA cards support 1.4; however, the current versions DO!

Quote: "I'd also like to point out the conspicious absense of ps_1_4 here. And how when it comes to OGL extensions, only the very old and basic ARB_*_program and Nvidia only extensions are supported. No ATI or other vendor extensions are mentioned. Hmmmm. Could this be that Nvidia is deliberately ignoring other platforms which hold certain shader models that aren't implemented in their hardware?"


The Cg compiler is open-source; why add support for cards/extensions you don't have to?
If ATI wish to release a Cg compiler update to support 1.4 they can, but considering 1.3 is only in there because of the Xbox, and 2.0 runs faster on all cards that support both, who cares?

Quote: "http://www.nvidia.com/object/feature_cinefx3.0.html
Guess that blows your whole DX9-isn't-capable-of-Shader-Model-3.0 theory out of the water, huh?"

Yes it would... IF they were using DirectX 9.0b, which, like I said before, does not CURRENTLY support 3.0!

Quote: "Quote: "Through the power of the Microsoft® DirectX® 9.0 Shader Model 3.0 and OpenGL® 1.5 APIs, programmers can now develop shader programs utilizing these technologies and techniques:"
Source: Advanced Technologies
http://www.nvidia.com/object/feature_cinefx3.0.html"


Oh, you mean OpenGL 1.5, which I believe a year ago everyone was telling me didn't exist, that OpenGL 2.0 was the next version, AND the OpenGL specification that NVIDIA themselves have single-handedly developed?
Add to this, OpenGL 1.5 has only PUBLICLY been released within the past month, whereas DirectX 8.1 is the DirectX which introduced all of the current Shader Models (if only in writing in the help file!)

March 2004 - OpenGL 1.5 (nvidia developed)
August 2001 - DirectX 8.1 (microsoft & nvidia co-developed)

Yeah, I'm sure your original argument stands fast there; I mean, I'm at a total loss as to how the hell Microsoft could possibly have had these specifications for models only recently released in OpenGL... I mean, unless perhaps I WAS RIGHT in the fact that Shader Models are DirectX.

This extends to the fact that both ATI and NVIDIA have had to personally support these extensions for 3-frikkin'-years, rather than them being ACTUAL specifications in OpenGL.

Shader Models are created for DirectX, Shader 4.0 was announced to be the MAJOR feature of DirectX10 ... DirectX9 introduced Shader 3.0, DirectX 8.1 introduced 2.0, DirectX 7.0 introduced Shader 1.0!!

The specifications might be supported by OpenGL, but they are neither native nor fully supported.

God, this is really some stupid-ass argument you've started, and I can see all the quotes you fancy throwing my way... but at the end of the day, my quote just for Mouse is really what you should've paid attention to!

- vs_1_1 for DirectX 8 and DirectX 9
- vs_2_0 and vs_2_x for DirectX 9
- ps_1_1, ps_1_2 and ps_1_3 for DirectX 8 and DirectX 9
- ps_2_0 and ps_2_x for DirectX 9

THESE ARE THE MODELS, THIS IS WHERE THEY ARE INTRODUCED, THIS IS WHERE THEY COME FROM!
OpenGL 1.4 ONLY has support for Model 1.0 and Model 2.0

You know what is entirely tiring here is the fact that you obviously have never read the shader manuals provided by either ATI or NVIDIA on their formats.


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Neophyte
Joined: 23rd Feb 2003
Location: United States
Posted: 18th Apr 2004 12:25 Edited at: 18th Apr 2004 12:26
@RobK

"How he can write that and keep a straight face I just don't know."

Arrogance and sheer stubbornness would be my best bet. I'm sure he honestly believes that he is right all the time about shaders and I'm wrong, despite the Mount Everest of evidence to the contrary that I throw at him. Good thing I argue with him for my own amusement, otherwise this would get boring after a while.

@Raven

"This is just bloody stupid."

The only stupid thing here is your ridiculous claims.

"OpenGL DOES NOT SUPPORT these Shader Specifications.
THE GRAPHICS CARD DRIVERS DO!"

I'd like to point out the extremely obvious once more.
Quote: "Through the power of the Microsoft® DirectX® 9.0 Shader Model 3.0 and OpenGL® 1.5 APIs, programmers can now develop shader programs utilizing these technologies and techniques"


Yes, OGL does. I linked you to an OpenGL extension. I quoted this from Nvidia's site. Try as you might to ignore it, it won't go away. OGL can and will support other formats.

"Why is this a major factor? Because put simply
ati_fragment_program ... includes PS 1.1/1.4/2.0
nv30_ ... includes PS 1.0/1.1/1.2/1.3/2.0/2.x"

I don't understand what you are getting at here. Doesn't this support my point that OGL supports these shader models? I mean, you just said as much here that these OGL extensions support those models.

"Now why is interesting?
Because under DirectX both NVIDIA FX-Series and Radeon 9-Series BOTH support the others specific shader versions!
What is even more interesting is that DirectX8.0 (8.1 for 2.x) supported these features way back in 2000 when it wasn't until OpenGL 1.4 when anything other than PS 1.3 & 1.4 were supported via these cards."

A few points on this twisted paragraph.

1. DX 8.1 didn't support 2.x (and here I can safely assume you mean both the VS and PS versions). That's pure fantasy. The only new versions that DX 8.1 added to the shader models are PS 1.2, 1.3 and 1.4. This obviously also extends to DX 8.0, which came before it. It only supported VS 1.0 and PS 1.0, and those only theoretically. At the time these were more a demo version of shaders, only supported through software emulation (which made them dog slow). They were meant to be test versions that developers would use to prepare for the future. The real versions came in DX 8.1 as hardware support (in the form of the GeForce 3) appeared.

Quote: "New Features in DirectX Graphics
Expanded pixel shader functionality with new version 1.2, 1.3, and 1.4. "

http://msdn.microsoft.com/archive/default.asp?url=/archive/en-us/dx81_c/directx_cpp/intro/dx8whatsnew.asp
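To keep the version history straight, here is the pixel-shader-per-DirectX-release mapping as a small sketch, based on the MSDN "What's New" page quoted above plus the DX9 SDK docs (pixel shader profiles only, summarised rather than exhaustive):

```python
# Pixel shader versions first specified in each DirectX release,
# per the MSDN "What's New" page linked above and the DX9 SDK docs.
ps_introduced = {
    "DirectX 8.0": ["ps_1_0", "ps_1_1"],
    "DirectX 8.1": ["ps_1_2", "ps_1_3", "ps_1_4"],
    "DirectX 9.0": ["ps_2_0", "ps_2_x", "ps_3_0"],
}

def dx_release_for(profile):
    """First DirectX release whose spec includes the given profile."""
    for release, profiles in ps_introduced.items():
        if profile in profiles:
            return release
    return None

print(dx_release_for("ps_1_4"))  # DirectX 8.1
print(dx_release_for("ps_3_0"))  # DirectX 9.0
```

Which is exactly the point: 2.x and 3.0 are DX9 features, not DX8.1 ones.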

2. You are really confusing when you say:
Quote: "Now why is interesting?
Because under DirectX both NVIDIA FX-Series and Radeon 9-Series BOTH support the others specific shader versions!"

What are you talking about? What other versions? How does this relate to the discussion we're having? You need to stop and take a breather, Raven, because your hysterics are making you hard to decipher.

"Not ALL ATI cards support 1.3, Not ALL NVIDIA cards support 1.4 however the current versions DO!"

I never said that all ATI cards support 1.3. In fact, IIRC, there are some laptop cards from them that only support PS 1.4 and VS 1.1, or at least that is what I heard.

I also elucidated on my points as to why I currently believe that Nvidia doesn't support 1.4. So I'll ignore this point for now.

"The Cg Compiler is Open-Source, why add support for cards/extensions you don't have to?"

I never said them not adding support for other cards was a bad thing, at least from a business standpoint. The source for the Cg compiler, which I have had on my comp for some time, only contains code for a generic profile. There are few optimizations implemented, if any, or at least that was the impression I got. In order for ATI to support PS 1.4 they would have to spend time and money developing and optimizing a compiler for a language that is primarily the creation of their chief rival.

Naturally, they aren't too inclined to waste time and money on such an endeavor, and I think that was Nvidia's strategy. They can claim compatibility and an open-source effort, but at the same time deny their rival the opportunity to exploit their custom shader model, thus wasting their efforts and money on it. It's a complicated chess game really, and it hinges on Cg becoming popular and widely used. If it becomes widely popular the PS 1.4 format may well fade into obscurity, which would be a blow to the nads for ATI as they have invested quite heavily in it.

"If ATI wish to release a Cg Compiler update to support 1.4 they can, but considering 1.3 is only in there because of the X-Box and 2.0 runs faster on all cards that support both who cares?"

This "who cares" mentality is exactly what leads me to believe that Nvidia wouldn't support PS 1.4. Why support it when, as you said, 2.0 runs faster and is more feature laden?

"Yes it would... IF they were using DirectX 9.0b, which like I said before DirectX does not CURRENTLY support 3.0!"

The kind of stubborn stupidity needed to think of that sentence as coherent is beyond me. DX 9 supports 3.0. Nvidia said it themselves. You're wrong. Get over it.

"Oh, you mean OpenGL 1.5 which i believe a year ago everyone was telling me didn't exist, that OpenGL 2.0 was the next version AND the OpenGL Speicification that NVIDIA themselves have single handedly developed?"

I was wondering when you were going to bring this up. Time for the real fun to begin.

First a little background for the folks not in the know.

The claims that he is referring to originate here:
http://darkbasicpro.thegamecreators.com/?m=forum_view&t=8191&b=1

The date is April 1st, 2003. This is important and I'll be getting back to this little tidbit later.

The claim in question is here:
Quote: "OpenGL 1.3 -> all the updates in it are for nVidia cards written by nVidia
OpenGL 1.4 -> enhanced updates by nVidia for the new GeForce3/4 enhancements
OpenGL 1.5 -> purely for the FX generation and still in Beta"


Among some of his more outrageous claims, like OpenGL updates being solely by Nvidia for Nvidia, he claimed that OGL 1.5 was in beta and "purely for the FX generation". I've dealt with the OGL-for-Nvidia-only claims in that thread very thoroughly, so I'll cut to the more relevant part: the existence of OGL 1.5.

OpenGL is governed by the ARB, Architecture Review Board.
Quote: "The OpenGL Architecture Review Board (ARB), an independent consortium formed in 1992, governs the OpenGL specification."

http://www.opengl.org/about/arb/overview.html
They keep meticulous notes of their proceedings, and from these it is clear that the OpenGL revision in progress was originally called OpenGL 2.0 and had been in the works for quite some time.

Here is a quote from one of their meeting notes from March 5, 2002:
Quote: "OpenGL 2.0 Status Update / SIGGRAPH Plans
Randi Rost presented a status update on 3Dlabs' work. Their proposed schedule has initial GL2 extensions by SIGGRAPH 2002, a full OpenGL 2.0 extension and spec at SIGGRAPH 2003."


OGL 2.0 had been in development for a long time. Over a year before we were even arguing.

It wasn't until June 10-11, 2003 that the next OpenGL was changed from 2.0 to 1.5.
Quote: "OpenGL Shading Language and related extensions approved as ARB extensions, but not promoted to the core. Therefore the new core revision will be OpenGL 1.5, not 2.0."

This was determined by voting, with a 6-4 vote (one abstention) in favor of promoting the OGL Shading Language to the core. However, since only a simple majority, and not a supermajority, voted in favor of it, it was moved to the ARB extensions as a compromise instead.
Quote: "VOTE for immediate promotion of the OpenGL Shading Language and extensions to the core: 6 Yes / 1 Abstain / 4 No.

The result was a simple majority of non-abstaining YES votes, but not a supermajority. Interpretation of this vote required some care since final spec approval requires a supermajority vote, while consideration of features for the final spec requires only a simple majority. Because the NO votes were strongly held, we expect that trying to approval a core revision including the shading language would carry extremely high risk of failing to approve the spec. We will therefore not include the shading language into the core at this time, but instead drive a new core version as soon as there's more experience with the extensions, perhaps as soon as this fall.

As previously agreed in the marketing working group, we will call the new core revision OpenGL 1.5, reserving OpenGL 2.0 for a future core revision including the shading language."

Source: http://www.opengl.org/about/arb/notes/meeting_note_2003-06-10.html#oglnext2

This all happened exactly TWO MONTHS AFTER Raven's argument with me. It was IMPOSSIBLE for him to know that OpenGL 1.5 would be the next revision as even the board members didn't know it would be until JUNE when they voted on it and we were arguing in APRIL. So unless Raven is psychic, he was dead wrong about the next revision being OGL 1.5 and I was right at the time.

"Add to this OpenGL 1.5 has only PUBLICALLY been released within the past month,"

Unless it's August 2003, you are way off the mark here.
Quote: "OpenGL version 1.5, released on July 29, 2003, is the fifth revision since the original version 1.0. "

http://www.opengl.org/documentation/opengl_current_version.html

" whereas DirectX 8.1 is the DirectX which introduced all of the current Shader Models (if only in writing in the help file!)"

You're kidding, right? Please tell me you're kidding.

"March 2004 - OpenGL 1.5 (nvidia developed)
August 2001 - DirectX 8.1 (microsoft & nvidia co-developed)"

July 2003 - OpenGL 1.5 (developed by the ARB)
November 2001 - DirectX 8.1 (Microsoft, with the collaboration of other gfx vendors)
Quote: "Saturday, November 10, 2001"

http://www.shacknews.com/onearticle.x/16979

"Yeah, i'm sure your original argument stands fast with that there; i mean i'm at a total loss to how the hell Microsoft could possible have had these specifications to models only recently released in OpenGL ... "

You and me too little buddy.

"i mean unless perhaps I WAS RIGHT in the fact that Shader Models are DirectX."

*shakes head in dismay*

You should really see a shrink about those delusions of grandeur of yours. They are getting out of control.

"This continues to the fact that both ATI and NVIDIA have had to personally support these extensions for 3-frikkin'-years, rather than them being ACTUAL specifications in OpenGL."

Err... so? Extensions are a part of OpenGL. If you bother to look at your own quote, you can see for yourself that the Cg compiler bases profiles around them.
Quote from YOU:
Quote: "Support for 14 profiles:
- vs_1_1 for DirectX 8 and DirectX 9
- vs_2_0 and vs_2_x for DirectX 9
- ps_1_1, ps_1_2 and ps_1_3 for DirectX 8 and DirectX 9
- ps_2_0 and ps_2_x for DirectX 9
- arbvp1 [OpenGL ARB_vertex_program]
- arbfp1 [OpenGL ARB_fragment_program]
- vp20, vp30 [NV_Vertex_program 1.0 and NV_Vertex_program 2.0]
- fp30 [NV30 OpenGL fragment programs]
- fp20 [NV_register_combiners and NV_Texture_shader)"


I bolded just two, but as you can plainly see, they are referencing existing OGL extensions. Just because they are supported through extensions doesn't mean they aren't legitimate. OGL extensions are a part of OpenGL. They always have been, and if you don't know that then I think that is really sad. It's basic info that anyone even remotely familiar with OGL should know. I feel like I'm talking to a child.
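Since the argument keeps circling back to whether a given extension is exposed, here is a sketch of how a GL program actually tests for one. `has_extension` is a hypothetical helper of my own, operating on the string `glGetString(GL_EXTENSIONS)` returns under a live context; the whole-token matching is the part that matters, because a naive strstr() would report GL_NV_vertex_program as present when only GL_NV_vertex_program1_1 is listed:

```c
#include <string.h>

/* Return 1 if `name` appears as a complete space-delimited token in
 * `ext_list` (the space-separated string glGetString(GL_EXTENSIONS)
 * hands back), 0 otherwise. */
int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == ext_list) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;   /* found as a whole token */
        p += len;       /* substring of a longer name; keep looking */
    }
    return 0;
}
```

A renderer would call something like `has_extension(exts, "GL_ARB_fragment_program")` before trying to use an arbfp1-style profile, and fall back to a simpler path otherwise.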

"Shader Models are created for DirectX, Shader 4.0 was announced to be the MAJOR feature of DirectX10 ... DirectX9 introduced Shader 3.0, DirectX 8.1 introduced 2.0, DirectX 7.0 introduced Shader 1.0!!"

Just when I thought you couldn't possibly get more ignorant about shaders you open your mouth and prove me wrong.

First off, DX 9 introduced the new VS and PS 2.0 models. Not Dx 8.1.
Quote: "DirectX 9.0 introduces significant improvements across its suite of APIs. DirectSound offers new audio capabilities, DirectShow accelerates video rending hardware, and Direct3D enhances low-level graphics programmability with new programmable vertex and pixel shader 2.0 models."

http://www.microsoft.com/presspass/press/2003/Jan03/01-22DirectXHLSLPR.asp

DX 8.1 introduced PS 1.2, 1.3, and 1.4.
Quote: "Expanded pixel shader functionality with new version 1.2, 1.3, and 1.4."

Notice the distinct lack of a 2.0 in there? Thought so.
http://msdn.microsoft.com/archive/default.asp?url=/archive/en-us/dx81_c/directx_cpp/intro/dx8whatsnew.asp

DX 8.0 introduced the earliest shading models. NOT DX 7!
Quote: "Programmable vertex processing language
Enables you to write custom shaders for morphing and tweening animation, matrix palette skinning, user-defined lighting models, general environment mapping, procedural geometry, or any other developer-defined algorithm.
Programmable pixel processing language
Enables you to write custom hardware shaders for general texture combining expressions, per-pixel lighting (bump mapping), per-pixel environment mapping for photorealistic specular effects, or any other developer-defined algorithm."

Source: Under What's New in DirectX Graphics
http://msdn.microsoft.com/archive/default.asp?url=/archive/en-us/dx81_c/directx_cpp/htm/whatsnewindirectx80.asp

"The specifications might be supported by OpenGL, but they are neither native or even fully supported."

???

So first they are supported. Then they aren't. Then they are, but not fully? You aren't making any sense whatsoever. I mean, if you are going to lie and exaggerate, at least get your story straight.

"- vs_1_1 for DirectX 8 and DirectX 9
- vs_2_0 and vs_2_x for DirectX 9
- ps_1_1, ps_1_2 and ps_1_3 for DirectX 8 and DirectX 9
- ps_2_0 and ps_2_x for DirectX 9

THESE ARE THE MODELS, THIS IS WHERE THEY ARE INTRODUCED, THIS IS WHERE THEY COME FROM!
OpenGL 1.4 ONLY has support for Model 1.0 and Model 2.0"

A few points:

1. The only one who needs to pay attention to that quote is you. Notice how DX 8 isn't next to vs_2_0, vs_2_x and their ps equivalents like DX 9 is? Would you mind trying to explain how this fits in with the above quote:
Quote: "Shader Models are created for DirectX, Shader 4.0 was announced to be the MAJOR feature of DirectX10 ... DirectX9 introduced Shader 3.0, DirectX 8.1 introduced 2.0, DirectX 7.0 introduced Shader 1.0!!"

If what you said is true, then why isn't DX 8 next to vs and ps 2.0 in that little quote of yours, hmmm?

2. With regards to:
Quote: "THESE ARE THE MODELS, THIS IS WHERE THEY ARE INTRODUCED, THIS IS WHERE THEY COME FROM!
OpenGL 1.4 ONLY has support for Model 1.0 and Model 2.0"

Then why did you say this?
Quote: "ati_fragment_program ... includes PS 1.1/1.4/2.0
nv30_ ... includes PS 1.0/1.1/1.2/1.3/2.0/2.x"

You're contradicting yourself again, Raven. You should try harder to keep your lies straight.

But let's pay attention to your quote once again.
Quote: "arbvp1 [OpenGL ARB_vertex_program]
- arbfp1 [OpenGL ARB_fragment_program]
- vp20, vp30 [NV_Vertex_program 1.0 and NV_Vertex_program 2.0]
- fp30 [NV30 OpenGL fragment programs]
- fp20 [NV_register_combiners and NV_Texture_shader"

Notice the ARB_vertex_program in brackets? That's an extension. Something you said (or at least implied) is part of OpenGL yourself when you said:
Quote: "OpenGL 1.4 ONLY has support for Model 1.0 and Model 2.0"

This support comes through extensions, which have always been a part of OpenGL. I've pointed out extensions before that support the DX models other than 1.0 and 2.0, and you said that ATI_fragment_program and nv30_ support the other models as well. Nvidia has also said that the upcoming 3.0 models will be accessible from OpenGL 1.5 themselves.
Quote: "Through the power of the Microsoft® DirectX® 9.0 Shader Model 3.0 and OpenGL® 1.5 APIs,"

http://www.nvidia.com/object/feature_cinefx3.0.html

As new features like shader models come along, extensions are released to access them. Since there have been no extensions for 3.0 yet, it's reasonable to assume that such functionality doesn't exist in current hardware. The GeForce FX series doesn't have VS and PS 3.0 implemented in it secretly. Period. If it were in the hardware, they'd add an extension like NV_vertex_program1_1 or NV_vertex_program2 (those, by the way, do exist. Check the extension link I gave you to Nvidia's site).

"you know what is entirely tiring here is the fact that you obviously have never read the Shader Manuals provided by either ATI or NVIDIA on thier formats."

Rest assured, Raven, that the feeling is entirely mutual.
Lord Ozzum
20
Years of Service
User Offline
Joined: 29th Oct 2003
Location: Beyond the Realms of Death
Posted: 18th Apr 2004 17:08
I don't know much about graphics cards, but I know Neophyte knows stuff, and he tends to get a little stubborn, but Raven is too. C'mon guys, it's just a graphics card... I think... but, Neophyte, you're a good arguer (dunno how to spell it).

Once I dreamt that I fell into a lake full of the undead and demons. I screamed and hollered as my kitten jumped into it with me. None of my friends helped me.

I don't trust them anymore.
Gery
20
Years of Service
User Offline
Joined: 24th Jan 2004
Location:
Posted: 18th Apr 2004 22:44
so, some hardware guru, help me:
what should I buy?
GeForce FX 5200 or Radeon 9000?
I thought the 9000 doesn't support DirectX 9, and the 5200 is very slow under DX9, but does somebody know which is the better one?

Ezerkilencszázhatvanba' / ördög szart a katlanba /aki először megszólal /azé lesz a katlan szar.
Shadow Robert
21
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 18th Apr 2004 22:52
Sod the quoting, you can just figure the hell out where this has come from!
I'm sick and tired of this pure stupid attitude you have.

In case anyone has actually forgotten why the hell Neo started this, it is because of this comment:

"GeforceFX are suppose to support 3.0 (as specified in DirectX 9.0), but for some reason the drivers lock you to 2.x"

to which the reply was

"Or maybe its because it isn't in the graphics card.
I thought I linked you to nvidia's site where they listed all of the OGL extensions for their cards and NONE of them had extensions for 3.0? Or how about all of the hardware review sites out there that state that the FX cards at most support VS and PS 2_x?"


... Okay so let's first address my first comment.

"The NVIDIA CineFX engine implements both OpenGL® and DirectX 9.0 specifications. These APIs give developers access to many new programming tools that speed the rate of effects development. Those features include support for: Pixel Shader 2.0+, Vertex Shader 2.0+"

"By combining the new NVIDIA GPU with Microsoft’s innovative DirectX® 9.0 API, NVIDIA CineFX 2.0 engine gives developers access to the full range of DirectX® 9.0 API Pixel Shader and Vertex Shader Models."

DirectX 9.0 has Shader Model 3.0 specified, thus the GeForce FX is capable of these models. GeForce FX cards only report Vertex Shader 2.0 and Pixel Shader 2.0 as their versions; however, they support the 2.x (2.0+) features. In DirectX you can query this value in minor/revision versioning; the GeForce FX 5800/57x0/59x0 all report the minor revision '35', which indicates Shader 3.x compatibility, just as '20'/'25' in DirectX 8.0 indicated 2.0/2.x compatibility.

On to what Neophyte is trying to claim here.
Anything that is in DirectX 9.0 is in OpenGL...

"Version: 56.64
Release Date: March 15, 2004
Microsoft® DirectX® 9 and OpenGL® 1.5 support"

"Version: 53.04
Release Date: February 2, 2004
Microsoft® DirectX® 9 and OpenGL® 1.4 support"

OpenGL 1.5 is what supports this full range of Pixel Shaders and Vertex Shaders. Why would the OpenGL Extensions have anything to do with these Models? (Neophyte seems to believe so)
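Whichever way that argument goes, the core version a driver exposes is something a program can check directly: glGetString(GL_VERSION) returns a string that, per the GL spec, begins with "major.minor" and may be followed by vendor-specific text. A minimal sketch of the parse, with `parse_gl_version` as a hypothetical helper of my own and the example string illustrative rather than a verbatim driver string:

```c
#include <stdio.h>

/* Parse the leading "<major>.<minor>" out of a GL_VERSION string such
 * as "1.5.0 NVIDIA 56.64".  Returns 1 on success, 0 on NULL input or
 * a string that does not start with a version number. */
int parse_gl_version(const char *version, int *major, int *minor)
{
    if (version == NULL)
        return 0;
    return sscanf(version, "%d.%d", major, minor) == 2;
}
```

A program that needs a GL 1.5 feature would then require `major > 1 || (major == 1 && minor >= 5)` before relying on it.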

However we'll play along for now. On the complete list of NV30 Extensions to OpenGL you will find the following:

NV_texture_shader //pixel shader 1.0
NV_texture_shader2 //pixel shader 2.0
NV_texture_shader3 // pixel shader 3.0?? what doesn't exist on the FX??
NV_vertex_program // vertex shader 1.0
NV_vertex_program1_1 // vertex shader 1.1
NV_vertex_program2 // vertex shader 2.0

http://cvs1.nvidia.com/DEMOS/OpenGL/inc/glh/

See now that is strange that in the extensions we find support for 3.0 ... very strange when the card doesn't support it, apparently.

I would go through why these extensions aren't counted as OpenGL, but quite frankly, why should I care if you guys understand OpenGL and NV30 development?

Next up is DirectX 9 and Shader 3.0, just to clarify everything.

"Shader Model 3.0 - No Limits
Updated: April 5, 2004
By D. Sim Dietrich Jr., Nvidia
Microsoft® DirectX® 9.0 introduced several new standards for advanced vertex and pixel shader technology, version 2.0 and version 3.0. Shader Model 2.0 hardware has been available for over a year, and both hardware and software support is growing rapidly. Shader Model 2.0 includes technologies useful for advanced lighting and animation techniques, but has limited shader program length, and complexity, which limits the fidelity of the effects that can be achieved.

As developers push against the limits inherent in Pixel Shader 2.0 and Vertex Shader 2.0, they have started to adopt the newer, more advanced Shader Model 3.0. This shader model has advances in several areas, in both pixel and vertex shader processing."


This was the unveiling document for Shader 3.0, which was a speech given by Sim Dietrich at WinHEC.
Why, yes he is talking about DirectX 9.0 and he does mention Shader 3.0.

However, unlike what Neophyte is trying to make it appear I said (that DirectX 9.0 does not support Shader 3.0), what has actually been said is this:
DirectX 9.0/9.0a/9.0b do not support Shader 3.0.
DirectX 9.0c, due out at WinHEC within the next month, however, does.

So why can that quote be used by both of us to prove a point?
It can't, to be truthful, as it just says DirectX 9.0...
Unfortunately, what Neophyte doesn't appear to understand is that there are four versions of DirectX 9.0.

DarkBASIC Professional 1.00 did not support DarkBASIC Objects; 1.05, however, does. They're still both DarkBASIC Professional 1.0.

So a news item can state quite clearly and truthfully that DarkBASIC Professional 1.0 supports DarkBASIC Objects; it doesn't mean that 1.00-1.04 does, it just means that A version of it does.

So how exactly can I prove that I am correct and Neophyte is wrong on this matter?
Simple: DirectX 9.0b was released in Summer (August) 2003. This is the press release announcing Shader 3.0, and it is dated 4th April 2004.
DirectX has not been updated since August. Therefore, DirectX 9.0b does not support Shader 3.0.

This isn't to say that Shader 3.0 has not been included and outlined since DirectX 9.0; however, you cannot use it.
(Try using shader_3_sw and you will see exactly what I mean.)

This is also why every reviewer who has tried Far Cry, a game which is capable of using Shader 3.0, when testing the 6800 has discovered that the game still uses Shader 2.0.
Now that the Shader 3.0 specification is set, it will be fully activated and edited accordingly.

As for your claim that DirectX 8.1 does not support Shader 2.0, what you have taken that from is the DirectX 8.0 manual.
8.0 and 8.1 are COMPLETELY different APIs. 8.1 is actually closer to 9.0 in development, whereas 8.0 is closest to 7.0.

Now go and take your bad mood out on some f**ker who will actually stand for it! Cause I'm not going to budge even an inch, and if you have to bring up irrelevant arguments and threads just to keep yourself looking like you are on top then quite frankly don't even bother. You might impress the slackjaws of this forum, but don't even think you're going to impress or sway me from the subject at hand.

Get a bloody life!


Athlon64 FX-51 | 1.5Gb DDR2 PC3400 | GeForce FX 5900 Ultra 56.60 | DirectX9.1 SDK | Audigy2 | Windows XP 64-Bit
Rage_Matrix
21
Years of Service
User Offline
Joined: 14th Dec 2002
Location: Brighton, UK.
Posted: 18th Apr 2004 23:59
Fight! Fight! Fight!

www.tronsoftware.co.uk
AMD Athlon XP 1700+, 180GB HDD, 512MB DDR RAM, ATI Radeon 9700 Pro 128MB DDR, Windows XP Pro, DirectX 9.0b
the_winch
21
Years of Service
User Offline
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 19th Apr 2004 00:44 Edited at: 19th Apr 2004 00:45
http://www.thegamecreators.com/?m=forum_view&b=1&t=10819&b=2


Quote: "Get a bloody life!"

kinda amusing coming from someone who has just written ~1000 lines of bull

you ain't the cops
empty
22
Years of Service
User Offline
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 19th Apr 2004 03:47
Yeah, later versions of the 80286 went up to 20 MHz, although 8, 10 and 12 MHz were the most common variants. And there was no SX version. That stuff started with the 386. The 386DX had a 32-bit bus and the SX (the low-cost variant) a 16-bit time-multiplexed bus. Oh, and by the way, Doom needed at least a 386/33 CPU. It was released at the end of 1993, when 486s were standard anyway.

Me, I'll sit and write this love song as I all too seldom do
build a little fire this midnight. It's good to be back home with you.
Neophyte
21
Years of Service
User Offline
Joined: 23rd Feb 2003
Location: United States
Posted: 19th Apr 2004 09:24
@Raven

... Okay so let's first address my first comment.

"The NVIDIA CineFX engine implements both OpenGL® and DirectX 9.0 specifications. These APIs give developers access to many new programming tools that speed the rate of effects development. Those features include support for: Pixel Shader 2.0+, Vertex Shader 2.0+"

"By combining the new NVIDIA GPU with Microsoft’s innovative DirectX® 9.0 API, NVIDIA CineFX 2.0 engine gives developers access to the full range of DirectX® 9.0 API Pixel Shader and Vertex Shader Models."

DirectX 9.0 has Shader Model 3.0 specified, thus Geforce FX is capable of these models.


How you could miss the flaw in this logic is beyond me. Just because it's specified doesn't mean it's in the card! DX 8.1 specified PS 1.4, but PS 1.4 isn't in the GeForce Ti 4200, remember?

GeForce FX cards only report Vertex Shader 2.0 and Pixel Shader 2.0 as their versions; however, they support the 2.x (2.0+) features. In DirectX you can query this value in minor/revision versioning; the GeForce FX 5800/57x0/59x0 all report the minor revision '35', which indicates Shader 3.x compatibility, just as '20'/'25' in DirectX 8.0 indicated 2.0/2.x compatibility.

No, it doesn't. What you are reading is a core revision number, not a VS and PS version number. The GeForce FX 5900 runs on the NV35 core, the early FXs (like the 5800) run on the NV30 core, and the GeForce 4s were on the NV25 core. What on earth gave you the idea that those were shader version numbers?
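For what it's worth, the number a Direct3D application actually reads is the PixelShaderVersion (or VertexShaderVersion) DWORD in the D3DCAPS9 structure filled in by GetDeviceCaps, packed by the D3DPS_VERSION macro from the SDK's d3d9.h; no core name like NV35 ever appears in it. A minimal sketch of the packing and decode, with the macro reproduced so the snippet stands alone:

```c
/* D3DPS_VERSION as defined in the DirectX 9 SDK's d3d9.h: the high
 * word is 0xFFFF, the shader major version sits in bits 8-15 and the
 * minor version in bits 0-7. */
#define D3DPS_VERSION(major, minor) \
    (0xFFFF0000u | ((unsigned)(major) << 8) | (unsigned)(minor))

/* Decode helpers (my own, for illustration). */
unsigned ps_major(unsigned version) { return (version >> 8) & 0xFFu; }
unsigned ps_minor(unsigned version) { return version & 0xFFu; }
```

So a game checks something like `caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)` before selecting its PS 2.0 path, regardless of which core the card is built on.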

OpenGL 1.5 is what supports this full range of Pixel Shaders and Vertex Shaders.

Really? That's not what you said before.

Quote: "Add to this OpenGL only supports FP 1.0 and 2.0 ... no varations."

Quote: "OpenGL DOES NOT SUPPORT these Shader Specifications.
THE GRAPHICS CARD DRIVERS DO!"


Why would the OpenGL Extensions have anything to do with these Models? (Neophyte seems to believe so)

And here is where your true ignorance shows. All Shader models are accessed through extensions in OpenGL, Raven. Look at your own quote for the proof.
Quote: "
- arbvp1 [OpenGL ARB_vertex_program]
- arbfp1 [OpenGL ARB_fragment_program]
- vp20, vp30 [NV_Vertex_program 1.0 and NV_Vertex_program 2.0]
- fp30 [NV30 OpenGL fragment programs]
- fp20 [NV_register_combiners and NV_Texture_shader)"

Notice how all of those OpenGL shader models have extensions after them? And did you read the quote above about the ARB's decision to support GLSL through extensions and not the core?
Quote: "VOTE for immediate promotion of the OpenGL Shading Language and extensions to the core: 6 Yes / 1 Abstain / 4 No.

The result was a simple majority of non-abstaining YES votes, but not a supermajority. Interpretation of this vote required some care since final spec approval requires a supermajority vote, while consideration of features for the final spec requires only a simple majority. Because the NO votes were strongly held, we expect that trying to approval a core revision including the shading language would carry extremely high risk of failing to approve the spec. We will therefore not include the shading language into the core at this time, but instead drive a new core version as soon as there's more experience with the extensions, perhaps as soon as this fall.

As previously agreed in the marketing working group, we will call the new core revision OpenGL 1.5, reserving OpenGL 2.0 for a future core revision including the shading language.""

Source, Again: http://www.opengl.org/about/arb/notes/meeting_note_2003-06-10.html#oglnext2


NV_texture_shader //pixel shader 1.0
NV_texture_shader2 //pixel shader 2.0
NV_texture_shader3 // pixel shader 3.0?? what doesn't exist on the FX??
NV_vertex_program // vertex shader 1.0
NV_vertex_program1_1 // vertex shader 1.1
NV_vertex_program2 // vertex shader 2.0

http://cvs1.nvidia.com/DEMOS/OpenGL/inc/glh/

See now that is strange that in the extensions we find support for 3.0 ... very strange when the card doesn't support it, apparently.


You obviously didn't look too closely at those extensions. NV_texture_shader3 isn't PS 3.0. NV_texture_shader2 was provided as a means to better support 3D textures as not all implementations of NV_texture_shader will support 3D textures in hardware.
Quote: "Overview

This extension extends the NV_texture_shader functionality to
support texture shader operations for 3D textures."

Quote: "Why a separate extension?

Not all implementations of NV_texture_shader will support 3D
textures in hardware.

Breaking this extension out into a distinct extension allows OpenGL
programs that only would use 3D textures if they are supported
in hardware to determine whether hardware support is available by
explicitly looking for the NV_texture_shader2 extension."

Source: http://www.nvidia.com/dev_content/nvopenglspecs/GL_NV_texture_shader2.txt
Next time, bother to actually read the text of the extensions and not just assume that because there are a bunch of numbers they must indicate a shader model.

I would go through why these extensions aren't counted as OpenGL, but quite frankly, why should I care if you guys understand OpenGL and NV30 development?

That's a pity. I'd love to hear your fabulous explanation as to why some extensions count (ARB_fragment_program for arbfp1) and others don't.

"Shader Model 3.0 - No Limits
Updated: April 5, 2004
By D. Sim Dietrich Jr., Nvidia
Microsoft® DirectX® 9.0 introduced several new standards for advanced vertex and pixel shader technology, version 2.0 and version 3.0. Shader Model 2.0 hardware has been available for over a year, and both hardware and software support is growing rapidly. Shader Model 2.0 includes technologies useful for advanced lighting and animation techniques, but has limited shader program length, and complexity, which limits the fidelity of the effects that can be achieved.

As developers push against the limits inherent in Pixel Shader 2.0 and Vertex Shader 2.0, they have started to adopt the newer, more advanced Shader Model 3.0. This shader model has advances in several areas, in both pixel and vertex shader processing."

This was the unveiling document for Shader 3.0, which was a speech given by Sim Dietrich at WinHEC.
Why, yes he is talking about DirectX 9.0 and he does mention Shader 3.0.

However, unlike what Neophyte is trying to make it appear I said (that DirectX 9.0 does not support Shader 3.0), what has actually been said is this:
DirectX 9.0/9.0a/9.0b do not support Shader 3.0.
DirectX 9.0c, due out at WinHEC within the next month, however, does.


Then why didn't he say DX 9.0c would support them, and not its predecessors, hmmm?

So why can that quote be used by both of us to prove a point?

Oh, this is going to be good.

It can't to be truthful as it just says DirectX 9.0...
Unfortunately what Neophyte doesn't appear to understand is that there are 4 versions of DirectX 9.0.


What you don't appear to understand is that there are two minor revisions to DX 9: DX 9.0a and 9.0b. No third revision (your DX 9.0c) yet.

DarkBASIC Professional 1.00 did not support DarkBASIC Objects, 1.05 however does. They're still both DarkBASIC Professional 1.0.

No, they aren't. They are 1.0 and 1.05, respectively. No one refers to DarkBasic Pro as just 1.0. They specify which upgrade it is, because earlier versions don't support what later versions have. This is exactly what Nvidia and Microsoft would have done if some kind of functionality that exists in a later version didn't exist in a prior one.

So a news item can state quite clearly and truthfully that DarkBASIC Professional 1.0 supports DarkBASIC Objects; it doesn't mean that 1.00-1.04 does, it just means that A version of it does.

No it wouldn't. And the fact that you would seriously suggest this just goes to show how desperate you are to dodge the fact that I provided a direct quote from Nvidia contradicting you.

Simple: DirectX 9.0b was released in Summer (August) 2003. This is the press release announcing Shader 3.0, and it is dated 4th April 2004.
DirectX has not been updated since August. Therefore, DirectX 9.0b does not support Shader 3.0.


That has to be one of the stupidest things I've heard you say in a long time. This is not a press release announcing a new shader model being introduced into the spec. It's there to announce that we finally have hardware capable of running it.
Quote: "Once again, NVIDIA introduces groundbreaking new hardware technologies in the GeForce 6 Series of graphics processing units (GPUs) that push 3D real-time graphics one step closer to film quality. These newest GPUs provide the hardware brainpower developers need to create stunning, real-time 3D effects in their games and applications, and the hardware muscle to keep your system performing at top speeds.

Advanced Technologies
The third-generation of the NVIDIA® CineFX™ engine unleashes the power of the latest NVIDIA GPUs and streamlines the creation of complex visual effects. Through the power of the Microsoft® DirectX® 9.0 Shader Model 3.0 and OpenGL® 1.5 APIs, programmers can now develop shader programs utilizing these technologies and techniques:"


This isn't to say that Shader 3.0 is not included & outlined since DirectX 9.0 however you cannot use it.
(try using the shader_3_sw and you will see exactly what i mean)


There is no such thing as a software version of Shader Model 3.0, kiddo. That would be why you can't use it. And you can't run Shader Model 3.0 because, *gasp*, there is no hardware support for it.

I have to admire the mental gymnastics, though. You keep on claiming DX 9 isn't capable of Shader Model 3, and here is a quote saying they are going to use Shader Model 3 with DX 9. Most people would have admitted they were wrong, or at the very least dropped the subject, but not you. You keep on trying even though I'm sure even you know that, deep down, you're wrong.

This is also why every reviewer who has tried Far Cry a game which is capable of using Shader 3.0 when testing the 6800 has discovered that the game still uses Shader 2.0.

Now that the Shader 3.0 specification is now set, it will be fully activated and edited accordingly.


Far Cry is not capable of running things on Shader Model 3. It only has Shader Model 2 shaders. No doubt when the GeForce 6 is launched they'll add a patch which adds in these shaders so it looks prettier on the GeForce 6, but that's not what you are implying. What you are implying is that come the GeForce 6 and the accompanying driver updates, it will magically be capable of it on older FX cards, and that isn't going to happen.

As for your claim that DirectX 8.1 does not support Shader 2.0, what you have taken that from is the DirectX 8.0 manual.

No, what I took it from was the press release for Dx 8.1. Here is the title in BIG BOLD LETTERS:
Quote: "What's New in DirectX 8.1"

And below the title under "New Features in DirectX Graphics":
Quote: "Expanded pixel shader functionality with new version 1.2, 1.3, and 1.4."

http://msdn.microsoft.com/archive/default.asp?url=/archive/en-us/dx81_c/directx_cpp/intro/dx8whatsnew.asp

8.0 and 8.1 are COMPLETELY different APIs. 8.1 is actually closer to 9.0 in development, whereas 8.0 is closest to 7.0.

No, actually, 8.0 is close to 8.1. It's not completely different. It's not even really different. Really different would be the difference between 7 and 8, when the DirectDraw section of the API was integrated into the D3D part of the API.

Now go and take your bad mood out on some f**ker who will actually stand for it! Cause I'm not going to budge even an inch, and if you have to bring up irrelevant arguments and threads just to keep yourself looking like you are on top then quite frankly don't even bother. You might impress the slackjaws of this forum, but don't even think you're going to impress or sway me from the subject at hand.

Yes, I know you are a stubborn mule who will never admit to the fact that he is wrong. I learned that long ago. Now, I just argue with you for entertainment or if you have the guts to contradict me.

Get a bloody life!

Get a bloody clue.


But since I like to save the best things for last, why don't I settle this whole argument about DX 9.0b not supporting Shader Model 3.0 right now.

Quote: "Q: What will the next version of DirectX be called?
A: DirectX 9.0c

Q: What will be new in DirectX 9.0c?
A: Not much. There will be some fixes for Shader 3.0, which is already in DirectX 9.0b."

Source: http://www.activewin.com/faq/faq_7.shtml

I've got one word for you Raven...



las6
22
Years of Service
User Offline
Joined: 2nd Sep 2002
Location: Finland
Posted: 19th Apr 2004 09:54
Yeah, it's just like empty said, the SX was the low-cost version. The cheap one. And also the crappy one. Duron, anyone? The DX was the proper version, so if you had had such a 286 SX, it would've been slower than your regular 286DX.


| Keyboard not detected. Press F1 to continue. |
IanM
Retired Moderator
22
Years of Service
User Offline
Joined: 11th Sep 2002
Location: In my moon base
Posted: 19th Apr 2004 15:11
o_O, offensive picture deleted.

Stop the childishness guys. Take it somewhere else. *LOCKED*

*** Coming soon - Network Plug-in - Check my site for info ***
For free Plug-ins, source and the Interface library for Visual C++ 6, .NET and now for Dev-C++ http://www.matrix1.demon.co.uk
