
Geek Culture / nVidia release GeForce 6800 Ultra

M00NSHiNE
Joined: 4th Aug 2003
Location: England, UK
Posted: 5th May 2004 01:13
The most powerful graphics card on the planet. Here's a quick specs rundown. Please don't turn this into an ATi - nVidia flame war.

Form factor: Single slot AGP
Interface: 8x AGP
Core clock: 400MHz
Memory: 256MB GDDR3
Memory Clock: 600MHz (1200MHz DDR)
Memory interface: 256-bit
Bandwidth: 40GB per second (see the quick check below)
Polys per second: 600 million
Pixel Pipelines: 16
Pixel shader ALUs: 2 full, 1 cut down per pixel pipeline
Vertex units: 6
DirectX Version: Fully DirectX 9.0c compliant
** When the hell did 9.0c come out??**
Shader support: Pixel shader 3.0, vertex shader 3.0
2d video: Programmable video engine
Codec support: MPEG1/2/4, WM9
PSU required: 450W
Outputs: VGA, DVI, S-video
Maximum resolution: 3,456 x 2,592
OS required: MS Windows XP
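
For anyone wondering where the bandwidth figure comes from, it's just bus width times effective memory clock; a quick back-of-envelope check in Python (using the clocks listed above - the marketing number rounds the raw arithmetic up a little):

# Rough memory bandwidth check: bus width (bits) x effective memory clock.
bus_width_bits = 256          # 256-bit memory interface
effective_clock_hz = 1200e6   # 600MHz GDDR3, double data rate
bytes_per_second = bus_width_bits * effective_clock_hz / 8
print(f"{bytes_per_second / 1e9:.1f} GB/s")  # ~38.4 GB/s, quoted as ~40 above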

Pretty powerful, no? Let's hear your views and comments.

CattleRustler
Retired Moderator
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 5th May 2004 01:39
needs a 450W PSU??


* DBP_NETLIB_v1 - PLUGIN FOR DBP * Click Logo
Mooooooo!
M00NSHiNE
Joined: 4th Aug 2003
Location: England, UK
Posted: 5th May 2004 01:40
Aye, must be a power-hungry little monster.

IanM
Retired Moderator
Joined: 11th Sep 2002
Location: In my moon base
Posted: 5th May 2004 01:49
PC Format loved this card when they reviewed it, but refused to publish benchmarks until the drivers were complete ...

http://www.pcformat.co.uk/reviews/default.asp?pagetypeid=2&articleid=29785&subsectionid=679&subsubsectionid=711

*** Coming soon - Network Plug-in - Check my site for info ***
For free Plug-ins, source and the Interface library for Visual C++ 6, .NET and now for Dev-C++ http://www.matrix1.demon.co.uk
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 5th May 2004 01:56
Quote: " The most powerful graphics card on the planet."


Not for long! Both nVidia and ATI have more on the way I believe.

CattleRustler
Retired Moderator
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 5th May 2004 02:39
yes, then comes PCI Express which stomps all over AGP 8x


* DBP_NETLIB_v1 - PLUGIN FOR DBP * Click Logo
Mooooooo!
The Big Babou
Joined: 10th Feb 2003
Location: Cyberspace
Posted: 5th May 2004 05:10
Quote: "The most powerful graphics card on the planet."

Quote: "Please dont turn this into an ATi - nVidia flame war."


... must resist ... must resist

No, seriously, it has actually been around for two weeks now. I think I read something about a 480W PSU, but who cares about 30W at those numbers? Today ATI's X800 was introduced. It would be MY choice, because a 350W PSU is more than sufficient to power it, while it is nearly as fast as a GeForce 6800. (Though today different sites came up with tests, some declaring ATI the new king, others saying nVidia is.)

More interesting here is the comparison between the X800 Pro and the GeForce 6800GT, because this GeForce also has only one power connector and is also a single-slot card. It was faster in nearly all benchmarks, but as the X800 Pro has the same layout as the X800 XT, just with lower memory and core clocks plus 4 pixel pipes disabled, driver hacks may well appear which turn the Pro into an XT, making it faster than the GeForce GT again.

And lastly, nVidia seems to be driver cheating again, because if farcry.exe is renamed, the fps are lower than before. ATI calls it cheating, nVidia calls it optimizing. Everyone has to decide for themselves what it is. Personally I think it is cheating, because all games should be treated equally by the driver; that means the driver shouldn't contain things like "if game=farcry.exe then make_it_faster(10fps)".
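
To make that last point concrete, here is a toy Python sketch of the kind of per-executable lookup being described - purely hypothetical, nothing taken from any real driver:

# Toy illustration of the "if game=farcry.exe then make_it_faster()" idea.
PER_APP_TWEAKS = {"farcry.exe": {"shader_precision": "reduced", "filtering": "aggressive"}}
DEFAULTS = {"shader_precision": "full", "filtering": "standard"}
def settings_for(exe_name):
    # Rename the exe and the lookup misses, so the game falls back to the
    # generic path - which is why the fps would drop after a rename.
    return PER_APP_TWEAKS.get(exe_name.lower(), DEFAULTS)
print(settings_for("FarCry.exe"))   # app-specific tweaks applied
print(settings_for("renamed.exe"))  # generic settings, no tweaks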

Again, this is all MY opinion; don't feel offended by it. Both ATI and nVidia have created great graphics cards, and personally I will wait and see how things go: when those cards hit the shelves the first user problems will appear, new drivers will solve them and also increase and optimize performance, prices will drop, and we'll see which one ends up cheaper and whether there will be driver hacks. When all this is clear I'm going to buy my new graphics card. Until then my ancient Radeon 8500/9100 will serve me well.

... they call it a royale with cheese ...
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 5th May 2004 05:56
Babou, I'd strongly suggest taking the current benchmarks of each of the cards with not a pinch but a sodding bucket of salt.
Every single website seems to have their own tests that are identical to everyone else's yet are getting massively different results... even on almost identical machine setups you're seeing the X800 boasting a few extra FPS on one site and the 6800 boasting them on another.

In my opinion what is currently going down is a load of crap.
I'll gather some online info about Far Cry and see if I can explain this situation:

Point of fact is ATi are crying wolf; NVIDIA replied, 'if Far Cry is running faster on the 60.11 drivers this is because we have an optimised runtime compiler' ... sites have taken this to mean that NVIDIA are cheating *again*, and as such have taken as gospel the rumour that if you rename the exe the speed drops.

The most annoying fact about all of this is that everyone still knows damn well that ATi are using 32->16-bit conversion for images and running their 16 pipelines at 96-bit (rather than 128-bit) colour, but still with the 128-bit width, allowing them to push 21 pixels per cycle rather than the standard 16 they should.

The drop in quality is damn near unnoticeable unless you're really looking for it, but this is actually why ATi have refused to put Shader 3.0 into their cards.
I'm afraid their gamble this time isn't going to pay off in the long run, as Shader 3.0 is just too damn valuable.

Add to this that Far Cry's speed jumps dramatically using Shader 3.0; once DirectX 9.0c is released later this week(?), along with the Far Cry Shader 3.0 patch, no doubt the benefit gained from this new spec is going to prove a little too much for ATi to keep up with by cheating.

I'm not too bothered about the fact that they are still cheating like this; what pisses me off more is that it undermines compatibility for games, so they have to patch each game individually. As such, if your game is not important enough, you can just forget about any graphical glitches on ATi being fixed. (In other words it's us who get screwed over, not the 'power' gamers.)

That said, this time around it looks like a far more even match; and if you remember how much of a speed difference there was between the FX 5800 and 5900 ... well, just think what NVIDIA and IBM are no doubt cooking up as we speak.

The X800 XT is really pushing their technology as far as it'll go, whereas NVIDIA are renowned for constant improvements beyond what you'd think possible from the hardware.
The FX 5700 is a prime example.

Quite frankly, although I have a reference 6800 Ultra for testing animations, I don't plan to actually buy a gaming 6-Series card until I see some results I can trust.
Right now I can't think of a single online reviewer who doesn't constantly show their loyalty to a particular brand; I want to see some truly independent tests ... some university or something putting the cards through their paces.


AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS
The Big Babou
Joined: 10th Feb 2003
Location: Cyberspace
Posted: 5th May 2004 06:26
Agreed on those benchmarks being crap; the results are too different from site to site.
Again, one calls nVidia a cheater, others call ATI a cheater, whilst they both call it optimizing. Everyone has to decide for themselves what to think of it.
I know nVidia has future plans with IBM, but don't forget ATI won't give up on being faster than nVidia. They surely have some things in mind as well.
ATI also has some improvements, such as this new AA thingy which doubles the performance of AA when the fps is above 50 (2x25). More useful seems to be their new normal map compression, named 3Dc. It is supposed to compress 4:1 nearly losslessly and should be easy to implement.
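
For context on why 3Dc can get away with storing only two channels: a unit normal's Z can be rebuilt from X and Y in the shader. A minimal Python sketch of that reconstruction (just the maths, not ATI's actual format):

import math
def reconstruct_normal(x, y):
    # Rebuild the Z component of a unit normal stored as (x, y) only -
    # the trick that lets two-channel formats like 3Dc drop a channel.
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)
print(reconstruct_normal(0.3, 0.4))  # -> (0.3, 0.4, ~0.866)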
As I said, I'll wait till everything clears up.

... they call it a royale with cheese ...
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 5th May 2004 09:55
Actually it's already in the drivers apparently. Some site shows you how to unlock the 'temporal aliasing' ... however no one is willing to show the results of it.

I know ATi aren't going to shy away from being the fastest; that said, the cards this time around are very evenly matched,
especially when you consider that both ATi cards are running between 75-120MHz faster than the 6800 Ultra.

With the Pro able to be overclocked to around the same clocks as the XT, this would suggest these cards have a very visible limit.
GeForce reference boards are known to be tame compared to retail models, so that's another reason to keep it calm for now.

We're also looking at the future with NVIDIA. Shader 4.0 is at least a year off, as DirectX 10 is being released with this new model and is going to ship initially as part of Windows 6.0 (the official new name is apparently being announced tomorrow), which means until then NVIDIA lead the market on technology...

Right now that doesn't matter greatly as everything is 1.1- and 2.0-based; that said, developers are no doubt going to patch, because 3.0 does offer greater performance as well as a more flexible pipeline.
Far Cry already has the patch ready for the DirectX 9.0c release; although it is unclear what else will feature these extensions, E3 is just around the corner.

We're going to know very, very soon, and my money is on HL2's reworked shader engine suddenly sprouting 3.0; same goes for Halo 2.
Really the only thing truly dragging down the GeForce's performance right now is its anti-aliasing, as only 2x/2xQ/4x are native; the rest are pure software and you can see it in the huge performance drop. There is a new technique I've seen them implement into the GeForce range which seems to mimic analogue scanline signals; really, how they're doing it right now is just far too costly.

I don't think people quite understand that for the Radeons it is a post-process thing, whereas the GeForce triples the screen size and averages down.

Which means when you're playing at 640x480 w/4x AA you're really playing at 1920x1440 and it's being scaled down. As the GeForce can't maintain this every frame, it only samples every Nth frame, and a loss in quality occurs.
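
A minimal Python sketch of that render-big-then-average-down idea (plain supersampling over a toy grayscale buffer - the principle only, not how either vendor's hardware actually does it):

def downsample(big, factor):
    # Average factor x factor blocks of a high-res buffer down to the target
    # size - the "render at 3x then scale down" idea described above.
    out = []
    for y in range(0, len(big), factor):
        row = []
        for x in range(0, len(big[0]), factor):
            block = [big[y + dy][x + dx] for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
print(1920 // 3, 1440 // 3)             # 640x480 at 3x means 1920x1440 internally
print(downsample([[0, 3], [3, 6]], 2))  # -> [[3.0]]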

What is sad is that the AA factor is what gets counted as image quality; very little is ever taken into account in terms of REAL image quality - as in colour bleeding, colour saturation/hue/lighting, etc...

Personally, another thing I find interesting is how you can now use 64-bit image processing on the GeForce. Again, it's not used yet, but it will be, as this is the OpenEXR standard (Industrial Light & Magic) ... it's going to become as standard as EAX is for sound cards. And the difference in quality is just phenomenal over the FX series, which can only process in 8/16-bit, and the Radeons, which process in a standard 24-bit.
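
For a sense of what "64-bit" means there: 16 bits per channel across RGBA, the half-float format OpenEXR is built around, versus the usual 8 bits per channel. A tiny numpy sketch of the storage difference (numpy only supplies the data types here; no actual OpenEXR I/O):

import numpy as np
h, w = 480, 640
ldr = np.zeros((h, w, 4), dtype=np.uint8)    # 8 bits x RGBA = 32 bits per pixel
hdr = np.zeros((h, w, 4), dtype=np.float16)  # half-float x RGBA = 64 bits per pixel
print(ldr.nbytes, hdr.nbytes)                # the half-float buffer is exactly twice the size
print(hdr.dtype.itemsize * 8 * 4, "bits per pixel")  # -> 64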

I find it quite amusing that not many Radeon users sit down and question how come Radeon buffers use 24-bit colour yet a 32-bit depth.
GeForce don't; they use a 24x8 depth ... but they run on a 32-bit pipeline. It's weird but true.


AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS
las6
Joined: 2nd Sep 2002
Location: Finland
Posted: 6th May 2004 14:29
Quote: "actually it's already in the drivers aparently. some site shows you how to unlock the 'temporal aliasing' ... however no one is willing to show the results of it."


umm... you haven't read tomshardware, then?
And it's "anti-aliasing", we don't want more edges, do we?
First of all, temporal anti-aliasing doesn't show up in screenshots, as it works a bit like TV: it applies the anti-aliasing to alternate frames or lines or whatever. Basically you can't grab a screenie and point it out. But it apparently works, and 2xTAA looks like 4xAA with an unnoticeable speed drop. Tom's Hardware did include benchmarks for this mode too...
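
In case the alternate-frames idea isn't clear, a toy Python sketch: jitter the sample pattern every other frame so two cheap patterns average out to a denser one over time (the concept only - nothing to do with ATi's actual driver):

# Two 2-sample patterns that alternate per frame; averaged over consecutive
# frames the coverage looks like a 4-sample pattern, hence "2xTAA ~ 4xAA".
PATTERNS = [
    [(0.25, 0.25), (0.75, 0.75)],  # even frames
    [(0.75, 0.25), (0.25, 0.75)],  # odd frames
]
def sample_offsets(frame_index):
    return PATTERNS[frame_index % 2]
for f in range(4):
    print(f, sample_offsets(f))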


| Keyboard not detected. Press F1 to continue. |
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 6th May 2004 22:32
I've seen the benchmarks, las6, but unless I see it I won't believe it. You could capture the TAA in a video output; I would be willing to download a video of it, but somehow I doubt we will ever see one.

I have also tried the solution ATi have posted for how to get it to work on their current range of cards using Catalyst 4.4.
As far as I can see, there doesn't appear to be much of a difference in quality.


AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS
The Big Babou
Joined: 10th Feb 2003
Location: Cyberspace
Posted: 7th May 2004 03:19
That video output idea is good, but it has a big disadvantage: it has to be uncompressed or losslessly compressed, making it VERY big. No problem for us broadband/flat-rate users, but a pain for the poor modem people. Lossy compression isn't an option, because artifacts occur on the edges - you know, right where the AA is doing its work.

There seems to be a mechanism which turns off TAA at under 50 FPS, because otherwise it would flicker. That means you get worse quality under 50 fps - no difference between 2xAA and 2xTAA then.

... they call it a royale with cheese ...
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 7th May 2004 03:46
A good test of how effective it is would be at 640x480, without sound, output to a clean HDD (you wouldn't need more than 30 seconds of footage to get a good idea of its quality).

In raw AVI this would cost around 60MB-ish, but if you use the Windows Media Encoder 9 Series without audio it will come out a lot smaller, around 8MB, and be pretty lossless (especially if it is already at 640x480).
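
For anyone who wants to run the numbers themselves, uncompressed size is just resolution x bytes per pixel x frame rate x duration; a rough Python helper (24-bit colour assumed - the figure swings a lot with frame rate and bit depth):

def raw_clip_size_mb(width, height, fps, seconds, bytes_per_pixel=3):
    # Uncompressed clip size estimate in MB (24-bit colour by default).
    return width * height * bytes_per_pixel * fps * seconds / (1024 * 1024)
print(round(raw_clip_size_mb(640, 480, 30, 30)), "MB for 30s of 640x480 at 30fps, uncompressed")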

The quality would be more than enough to see whether it is actually doing what ATi claim it does.
As I've said, it just plainly doesn't look any different on my 9800 Pro.


AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS
