Geek Culture / nVidia series 8 graphics cards on sale!

Author
Message
Kenjar
19
Years of Service
User Offline
Joined: 17th Jun 2005
Location: TGC
Posted: 12th Nov 2006 16:19 Edited at: 12th Nov 2006 16:36
Dabs.com is selling the new 8800 GTS graphics card, with 640MB of GDDR3 memory clocked at 1.6GHz and a 500MHz core speed, for £319.99. They are also selling the 8800 GTX, with a core speed of 575MHz and 768MB of GDDR3 rated at 1.8GHz. It's fully DirectX 10 compatible, so I'm told.

http://www.dabs.com/productlist.aspx?&NavigationKey=11137&NavigationKey=48070000&CategorySelectedId=11137&PageMode=1

Now they've introduced a few new ratings with this model as well. The first is shader clock speed: the GTS is rated at 1.2GHz and the GTX at 1.35GHz. Instead of graphics pipelines, they now have Shader Processing Units; the GTS has 96 and the GTX has 128. What these are I don't know yet, and few do, it seems. Next is the Texture Address Unit, or TAU: the GTS has 24 and the GTX has 32. Finally there is the Rasterization Operations Unit: the GTS has 20 and the GTX has 24. As far as I can tell the old graphics-pipeline comparison method is out the window, so it's going to be hard to compare these to the older series 5, 6 and 7 cards.

Furthermore, and bad news for Ageia, is the new Quantum Effects technology, which produces vividly realistic smoke, fire and explosion effects. The smoke example on http://www.nvidia.com/page/geforce8.html looks breathtaking. The real-time particle engine, which handles collision of particles with solid objects, is very, very impressive. They even demonstrate real-time water effects as it rolls off a realistic-looking rock. The cards now support 16x anti-aliasing, so there's no fear of pixelated edges anymore, and a 128-bit HDR engine helps provide more realistic lighting. With this next-generation physics technology built into a single card, I have to ask myself: why get a top-of-the-line series 7 card and an Ageia card, when I can get, for more or less the same sort of price, an 8800 GTS? The answer is simple of course: Ageia has been around for longer and is supported. But as games start supporting DirectX 10, the transition will not be far behind. It remains to be seen, however, whether the quantum engine will benefit DirectX 9.0c users.
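To give a feel for what that particle engine is doing, here's a toy CPU-side sketch in C++ (my own illustration, nothing to do with nVidia's actual Quantum Effects code): each particle is integrated under gravity and then bounced off a solid ground plane. The card's trick is running this kind of per-particle work for tens of thousands of particles at once.

// Toy sketch of per-particle physics: integrate under gravity, then resolve a
// collision against a solid surface (here a ground plane at y = 0).
// All names and numbers are illustrative only.
#include <vector>
#include <cstdio>

struct Particle { float x, y, z, vx, vy, vz; };

void step(std::vector<Particle>& particles, float dt)
{
    const float gravity = -9.81f;
    const float restitution = 0.4f;          // energy kept after a bounce
    for (Particle& p : particles)
    {
        p.vy += gravity * dt;                // integrate velocity
        p.x  += p.vx * dt;                   // integrate position
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        if (p.y < 0.0f)                      // hit the solid plane
        {
            p.y  = 0.0f;
            p.vy = -p.vy * restitution;      // bounce with damping
        }
    }
}

int main()
{
    std::vector<Particle> smoke(10000, Particle{0.0f, 5.0f, 0.0f, 0.0f, 0.0f, 0.0f});
    for (int frame = 0; frame < 60; ++frame)
        step(smoke, 1.0f / 60.0f);           // simulate one second at 60fps
    std::printf("first particle height after 1s: %.2f\n", smoke[0].y);
    return 0;
}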

I lay upon my bed one bright clear night, and gazed upon the distant stars far above, then I thought... where the hell is my roof?
BatVink
Moderator
21
Years of Service
User Offline
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 12th Nov 2006 17:48
The other question is this...it's a release of new technology for Nvidia, but what are Ageia already working towards?



David R
21
Years of Service
User Offline
Joined: 9th Sep 2003
Location: 3.14
Posted: 12th Nov 2006 17:51
Quote: "but what are Ageia already working towards?"


...failure possibly? I think this is just going to be Ageia being squashed by big companies who already have monopolies in their industry areas.

Ageia is first-gen tech, and personally I do not trust it enough; I would rather get a GPU that comes with a PPU as an additional extra than shell out tons of cash on a separate component.

Also, having a non combined GPU and PPU means updating both Ageia's PPU framework AND the GPU driver. I would rather just do a single update for a GPU/PPU than two, thanks.

Kenjar
19
Years of Service
User Offline
Joined: 17th Jun 2005
Location: TGC
Posted: 12th Nov 2006 18:13
Amen to that. Thanks Ageia for getting the ball rolling and all, but I think it's time for them to bug out of the PC industry and go make money somewhere else, or join forces with ATI.

I lay upon my bed one bright clear night, and gazed upon the distant stars far above, then I thought... where the hell is my roof?
Chris Franklin
19
Years of Service
User Offline
Joined: 2nd Aug 2005
Location: UK
Posted: 12th Nov 2006 18:37
Quote: "I think it's time for them to bug out of the PC industry and go make money somewhere else, or join forces with ATI.
"


ATI is already with AMD though, so there's no space there

Kenjar
19
Years of Service
User Offline
Joined: 17th Jun 2005
Location: TGC
Posted: 12th Nov 2006 19:26
I believe AMD basically bought ATI, so they are the same company these days in all but name. They don't have to buy out Ageia, merely work with them.

I lay upon my bed one bright clear night, and gazed upon the distant stars far above, then I thought... where the hell is my roof?
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 12th Nov 2006 21:50
Yeah, AMD bought out ATI last July. They will remain ATI for trading's sake and stay a separate division, but AMD now owns them, in the same way Eidos is technically part of SCi but keeps the Eidos name to make it look like they're still two companies.

It's becoming a very common thing in business today.

Quote: "The other question is this...it's a release of new technology for Nvidia, but what are Ageia already working towards?"


I guess you'll know that when it arrives. Ageia are fairly new to the hardware game, and have nowhere close to the resources ATI or NVIDIA have at their disposal. So if this did become an all-out competition from a purely hardware standpoint, then no doubt the GPU hardware developers would win.

Thing is, Ageia have a secret weapon in the shape of an overpriced black/silver box that will be on the market for at least 5 years and provides quite a bit of name-dropping clout.

In other words.. Playstation 3.

They also have an edge in the market, being the only professional middleware solution available for independent developers to make freeware or develop with, without having to be established or pay out ridiculous amounts of money.

Just looking at the hardware, then yes, I'd agree the new Quantum Effects does look pretty; but on the flip side we're talking PC-only for the next few years. We'll get gorgeous effects, but the rest of the gaming world won't. At the end of the day, PC gamers only make up a very small percentage of the gaming community world-wide.

Honestly speaking, this is exciting from a development position. Then again it's going to be a pain in the arse making several versions between systems.

For now I'm personally sticking with my own physics solution, which I plan to extend and enhance with both shader and physics-processor technology in time. Right now I'm going to assume that most of the people my software will affect won't have either of the technologies, which makes it not worthwhile yet.

Perhaps in a year or so.

As for the hardware itself, it's a very impressive performance boost over the previous generation; but then again, it's been a while since I've bought a "high-end" graphics card. I've always waited for the "mid-range" cards and purchased the best of them.

X1900XT for £140, or X1900XTX for £320, with the performance difference being ~20fps when both run Half-Life 2 at over 100fps on max graphics. tbh, that 20fps just ain't worth the extra cost.
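Rough arithmetic on those numbers: the XTX costs about £180 more for roughly 20 extra frames, which works out at around £9 per additional frame per second, against about £1.40 per frame for the XT on its own.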

Intel Core 2 Duo E6400, 512MB DDR2 667MHz, ATi Radeon X1900 XT 256MB PCI-E, Windows Vista Business / XP Professional SP2
Jeku
Moderator
21
Years of Service
User Offline
Joined: 4th Jul 2003
Location: Vancouver, British Columbia, Canada
Posted: 12th Nov 2006 22:17
As long as they keep ATI in Canada I'm fine with the merger.

Kenjar
19
Years of Service
User Offline
Joined: 17th Jun 2005
Location: TGC
Posted: 12th Nov 2006 23:44
lol, I'm sure AMD won't want to move offices.

I lay upon my bed one bright clear night, and gazed upon the distant stars far above, then I thought... where the hell is my roof?
BatVink
Moderator
21
Years of Service
User Offline
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 13th Nov 2006 14:54 Edited at: 13th Nov 2006 14:56
Quote: "Thing is that Ageia have a secret weapon the shape of an overly priced black/silver box that will be on the market for atleast 5years and provides quite a bit of name-dropping clout.

In other words.. Playstation 3."


Interesting point, although I'm sure I heard they are using a software-only solution, not the hardware. But does this mean it's embedded in the Playstation itself, or is there a PhysX module that Playstation developers can incorporate?

<NAMEDROP>I'm attending a conference on Thursday that will be attended by some big names across all platforms, I'll try and get their opinion on the situation.</NAMEDROP> Picking the right technology at their level could be make-or-break.


The other question is...is this generic shader support, or is it specific to Nvidia cards? Because if it's the latter, then we will potentially have 3 different solutions to cater for.



Codelike
18
Years of Service
User Offline
Joined: 13th Dec 2005
Location: DBP - Scouseland
Posted: 13th Nov 2006 23:44
Reg Hardware: nVidia readies five more G80 GPUs

Presumably for 8200, 8400, 8600, 8850 & 8875 models?

I have an XP3000+, 1.5gb DDR333, a 6600GT and I'm programming 3k text-based exe's?!
Kenjar
19
Years of Service
User Offline
Joined: 17th Jun 2005
Location: TGC
Posted: 14th Nov 2006 09:51 Edited at: 14th Nov 2006 09:52
Sounds about right to me. Well, the 8200, 8600 and perhaps the 8850 (though I'd expect 8900) do anyway. It's certainly consistent with previous releases, and I hope they stick to the numbering system. I dunno about anyone else, but I find the ATI 9550, 9600, 9800, then this X1600, X1800 rubbish totally confusing. nVidia's method of identifying the technology generation, then subdividing within the thousand, makes much more sense. It's a shame they don't stick with it totally, introducing all this GTS, LE, GTX, GT, GS rubbish, but nothing's perfect.

I lay upon my bed one bright clear night, and gazed upon the distant stars far above, then I thought... where the hell is my roof?
Codelike
18
Years of Service
User Offline
Joined: 13th Dec 2005
Location: DBP - Scouseland
Posted: 14th Nov 2006 15:17 Edited at: 14th Nov 2006 15:18
Quote: "It's a shame they don't stick with it totally, introducing all this GTS, LE, GTX, GT, GS rubbish, but nothing's perfect."


Yeah, speccing the last two digits (80xx) as a replacement for LE, GT, GTX etc. would be more user friendly, or even adding a .1, .2 etc. (80xx.1?).

I have an XP3000+, 1.5gb DDR333, a 6600GT and I'm programming 3k text-based exe's?!
Kenjar
19
Years of Service
User Offline
Joined: 17th Jun 2005
Location: TGC
Posted: 14th Nov 2006 15:58 Edited at: 14th Nov 2006 15:59
They don't even need to do that. I mean, 8200 refers to the entry-level reference design, 8600 refers to the mid-range reference design and 8800 refers to the high-performance reference design. So if an 8150 turns up, you know it's not much good (just like the 6150) and is below the entry-level requirements, making it totally unsuited to gameplay. An 8250, of course, refers to an improvement in speed and power over the standard nVidia 8200 reference design, meaning that while it's still entry level, you will be able to set your games' graphics settings a little higher. If an 8500 turns up, it's in the upper reaches of the entry-level classification, bordering on, but not quite reaching, mid-range performance. And so on.

Instead, because the companies who make these boards think that labelling something an 8550 will hurt their sales, they'd rather call it a 7600XT; XT is typically a value model of the reference design, and is often cheaper. The 7600GS again is a reduced-performance model, and the 7600GT is the classic reference design. They've made this a little more complex in the 8 series, of course, because it's now the 8800GTS (the same idea as GS for now, at least until someone makes an 8800GS) and the 8800GTX, which is the real baseline for the high-performance model right now.

But like I say, at least with the nVidia cards it's easy to work out roughly where they lie and what sort of performance you will get in relation to other models. ATI has no logical numbering system that I can see. They simply start off with a number like 9000 or X1000 and work their way up, so it's harder to work out at a glance which generation of technology powers a card (shader model, OpenGL version supported, DirectX version supported and so on).

nVidia isn't perfect, mainly due to squabbling companies out to cast a false impression, but it's easier to understand the rough level at a single glance.
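Just to show what I mean, here's a rough sketch in C++ of how you'd read a model number under that scheme; the exact tier boundaries are my own guesses from the examples above, not anything official from nVidia.

// Rough illustration of decoding an nVidia-style model number (e.g. 8600):
// thousands digit = technology generation, remainder = position in the range.
// The tier boundaries below are my own guesses, not nVidia's official scheme.
#include <cstdio>

const char* tier(int model)
{
    int position = model % 1000;      // where it sits inside its generation
    if (position < 200) return "below entry level (not suited to gaming)";
    if (position < 600) return "entry level";
    if (position < 800) return "mid range";
    return "high performance";
}

int main()
{
    const int cards[] = { 6150, 8200, 8500, 8600, 8800 };
    for (int m : cards)
        std::printf("%d: generation %d, %s\n", m, m / 1000, tier(m));
    return 0;
}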

To be honest it's this numbering system that keeps me loyal to nVidia.

I lay upon my bed one bright clear night, and gazed upon the distant stars far above, then I thought... where the hell is my roof?
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 14th Nov 2006 15:58
Quote: "Interesting point, although I'm sure I heard they are using a software-only solution, not the hardware. But does this mean it's embedded in the Playstation itself, or there is a PhysX module that Playstation developers can incorporate?"


Embedded PPU. It's not *as* powerful as the current on-market solutions, and it utilises the system memory rather than having its own.

If you're interested in it, then sign up with Ageia as a developer and try to blag access to the PS3 section. I've not bothered, because I have no interest in PS3 development; I can't stand developing for Linux platforms.

Quote: "The other question is...is this generic shader support, or is it specific to Nvidia cards? Because if it's the latter, then we will potentially have 3 different solutions to cater for."


I'm probably wrong about what you mean, but if you're talking about CUDA, it's nvidia-only and G80+ GPUs.
However the compiler, like Cg, will be open-source when it's finally publicly released, so like Cg it's up to ATI whether they want to add their own support.

Considering ATI already have their own version called 'Stream' in development along with Ashli, I highly doubt they will.

This graphics war is honestly beginning to hurt the one set of people it relies on to help push it all forward: the developers.

As I'm working with the Xbox 360 more exclusively now, I don't see a point in learning the nvidia technology. In recent months I've even had to ditch my beloved brand in favour of the X1K-Series... there's no point in having two cards given the price of cards with reasonable performance.

I'd rather have one mid-to-high-range card than two low-range cards.
People's finances aren't exactly infinite, and neither is developers' time to learn all of this new technology.

Quote: "Yeah, speccing the last two digits (80xx) as a replacement for LE, GT, GTX etc. would be more user friendly, or even adding a .1, .2 etc. (80xx.1?)."


Yeah but what would you buy based on name...

GeForce 8800.3
or
GeForce 8800 GTX

honestly?
It made sense a while ago...

GeForce 4 MX, GeForce 4 Ti
GeForce FX 5200, GeForce FX 5200 Ultra

I'm beginning to get irritated with the actual card numbering systems from nvidia.

Would prefer it to get back to basics.

GeForce 8 LE (Lite Edition)
GeForce 8 SE (Standard Edition)
GeForce 8 Ultra (Ultra Edition)
GeForce 8 Extreme (Extreme Edition)

Also, sod having several different cards; go back to the basics we had in the early 90s with the 2D cards, when VL-Bus was available.
I had an S3 805/810 ISA-VL graphics board, which came with the 805 chip and 512K of RAM. This was upgradeable to the 810 chip and expandable to 2MB of RAM.

Sure, it was a pain swapping the chips and adding more RAM, given they were all single chips holding 128K each. Still, it was a very damn good solution.

Performance between the basic version (I got mine for ~£50) and the fully decked-out version (810 chip £30, 1.5MB of chips ~£40) was just ridiculous.

Games on my 486 DX/100 like Earthsiege went from 10fps to 35fps. Considering it was doing all of it with software draw calls, that was an impressive increase, especially in those days.
Given most games were designed to run at 8fps and max out at 25fps, it was certainly impressive to me.

I don't see any reason why either nvidia or ATI couldn't start producing such boards, or even offer this kind of upgradeability for their on-board graphics processors.

Intel Core 2 Duo E6400, 512MB DDR2 667MHz, ATi Radeon X1900 XT 256MB PCI-E, Windows Vista Business / XP Professional SP2
