Quote: "Interesting point, although I'm sure I heard they are using a software-only solution, not the hardware. But does this mean it's embedded in the Playstation itself, or there is a PhysX module that Playstation developers can incorporate?"
It's an embedded PPU; it's not *as* powerful as the current on-market solutions, and it utilises system memory rather than having its own.
If you're interested in it, sign up with Ageia as a developer and try to blag access to the PS3 section. I've not bothered, because I have no interest in PS3 development; I can't stand developing for Linux-based platforms.
Quote: "The other question is...is this generic shader support, or is it specific to Nvidia cards? Because if it's the latter, then we will potentially have 3 different solutions to cater for."
I'm probably wrong about what you mean, but if you're talking about CUDA: it's nvidia-only and requires G80+ GPUs.
However, the compiler, like Cg's, will be open-source when it's finally publicly released, so as with Cg it's up to ATI whether they want to add their own support.
Considering ATI already have their own equivalent called 'Stream' in development, along with Ashli, I highly doubt they will.
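For anyone curious what CUDA actually looks like, here's a minimal vector-add sketch against the standard CUDA runtime API (cudaMalloc / cudaMemcpy and the <<<...>>> launch syntax are the real API; the kernel name and sizes are just illustrative). It should give an idea of why code written this way is tied to nvidia hardware:

#include <cstdio>
#include <cuda_runtime.h>

// Each thread adds one pair of elements; this data-parallel
// kernel model is the core of what CUDA provides.
__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);

    float hostA[n], hostB[n], hostC[n];
    for (int i = 0; i < n; ++i) { hostA[i] = float(i); hostB[i] = 2.0f * i; }

    // Device buffers live in the card's own memory.
    float *devA, *devB, *devC;
    cudaMalloc(&devA, bytes);
    cudaMalloc(&devB, bytes);
    cudaMalloc(&devC, bytes);

    cudaMemcpy(devA, hostA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(devB, hostB, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(devA, devB, devC, n);

    cudaMemcpy(hostC, devC, bytes, cudaMemcpyDeviceToHost);
    printf("c[10] = %f\n", hostC[10]);   // expect 30.0

    cudaFree(devA); cudaFree(devB); cudaFree(devC);
    return 0;
}

That launch syntax and kernel model are entirely nvidia's, which is exactly why ATI going their own way with Stream would leave developers maintaining a separate code path per vendor.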
This graphics war is honestly beginning to hurt the one group it relies on to push it all forward: the developers.
As I'm now working more exclusively with the Xbox 360, I don't see a point in learning the nvidia technology. In recent months I've even had to ditch my beloved brand in favour of the X1K-series; there's no point in having two cards given the price of cards with reasonable performance.
I'd rather have one mid-to-high-range card than two low-range ones.
People's finances aren't exactly infinite, and neither is our time as developers to learn all of this new technology.
Quote: "Yeah, speccing the last two digits (80xx) as a replacement for LE, GT, GTX etc. would be more user friendly, or even adding a .1, .2 etc. (80xx.1?)."
Yeah, but which would you honestly buy based on the name...
GeForce 8800.3
or
GeForce 8800 GTX
honestly?
It made sense a while ago...
GeForce 4 MX, GeForce 4 Ti
GeForce FX 5200, GeForce FX 5200 Ultra
I'm beginning to get irritated with nvidia's actual card numbering system.
I'd prefer them to get back to basics.
GeForce 8 LE (Lite Edition)
GeForce 8 SE (Standard Edition)
GeForce 8 Ultra (Ultra Edition)
GeForce 8 Extreme (Extreme Edition)
Also, sod having several different cards; go back to the basics we had in the early 90s with the 2D cards, when VL-Bus was available.
I had an S3 805/810 ISA/VL graphics board, which came with the 805 chip and 512K of RAM. It was upgradeable to the 810 chip and expandable to 2MB of RAM.
Sure, it was a pain swapping the chips and adding more RAM, given the RAM also came as single chips holding 128K each. Still, it was a damn good solution.
The performance difference between the basic version (I got mine for ~£50) and the fully decked-out version (810 chip £30, 1.5MB of RAM chips ~£40) was just ridiculous.
Games on my 486 DX/100 like Earthsiege went from 10fps to 35fps. Considering it was doing all of it with software draw calls, that was an impressive increase, especially in those days.
Given most games were designed to run at 8fps and max out at 25fps, it was certainly impressive to me.
I don't see any reason why either nvidia or ati couldn't start producing such boards again, or even offer that kind of upgradeability for their on-board graphics processors.
Intel Core 2 Duo E6400, 512MB DDR2 667MHz, ATi Radeon X1900 XT 256MB PCI-E, Windows Vista Business / XP Professional SP2