Geek Culture / physics processing unit (ppu)

Author
Message
Simian
20
Years of Service
User Offline
Joined: 16th Feb 2004
Location: my room
Posted: 9th May 2005 23:35
flibX0r
21
Years of Service
User Offline
Joined: 14th Feb 2003
Location: Western Australia
Posted: 9th May 2005 23:39
Yup, looks sweet. A hardware-implemented physics engine is one thing I would love to have.



You can't wield supreme executive power just because some watery tart threw a sword at you
ToyImp05
19
Years of Service
User Offline
Joined: 3rd May 2005
Location: Home
Posted: 10th May 2005 07:39
All I have to say is... oh... my... god. The computer gods are looking down upon us from the future.

eat a beaver, save a tree.
Simian
20
Years of Service
User Offline
Joined: 16th Feb 2004
Location: my room
Posted: 10th May 2005 18:00
And the coolest thing is that, because it frees the CPU from handling the physics, we should now start seeing OTHER areas of games start coming into their own as well. I'm sure the fact that the GPU freed the CPU from doing graphics is what helped physics become a bigger feature in games in the first place. It's a nice cascading effect.

Exciting times.


http://www.toontch.com
Nicholas Thompson
20
Years of Service
User Offline
Joined: 6th Sep 2004
Location: Bognor Regis, UK
Posted: 10th May 2005 19:31 Edited at: 10th May 2005 19:31
3 points:
1) Am I the only one that noticed the file name in the URL starts with STFU (i.e. Shut The F*** Up)?
2) Am I the only one that believes graphics/physics/sound/bells and whistles don't make games fun? They're nice and can AID a game in being fun - but since when did ANY of the classic games released 20 years ago REQUIRE hardware-accelerated graphics, sound and now physics to be fun? They didn't.
Quote: "Hardware-accelerated physics is a major innovation that is likely to breathe new life into the PC as a gaming platform"


I think that is wrong. It MIGHT make the physics better, but breathe new life into the PC as a gaming platform? NO! It won't only be the PC that takes it up - every console will too, making that statement redundant. Also, I refer you to my previous point: physics doesn't MAKE games fun. It can be a COMPONENT of a game, but it's the concept and structure of a game that makes it fun.

I also tend to find that it's the simple, pick-up-and-play games that stay around longest (most of the time).

3) Am I the only one that's noticed a severe lack of decent games recently, but an ever-increasing plague of tech demos?

My Website:
TKF15H
21
Years of Service
User Offline
Joined: 20th Jul 2003
Location: Rio de Janeiro
Posted: 10th May 2005 19:51
Nicholas Thompson
20
Years of Service
User Offline
Joined: 6th Sep 2004
Location: Bognor Regis, UK
Posted: 10th May 2005 20:42
Oops.. *hides in the corner sobbing*

My Website:
OSX Using Happy Dude
21
Years of Service
User Offline
Joined: 21st Aug 2003
Location: At home
Posted: 10th May 2005 21:02
Quote: "2) Am I the only one that believes graphics/physics/sound/bells and whistles dont makes games fun?"

Probably...

Visit http://homepages.nildram.co.uk/~nickk
Calm down dear! Its only The Unofficial DB Sci-Fi Con...
http://spaces.msn.com/members/BouncyBrick/
Simian
20
Years of Service
User Offline
Joined: 16th Feb 2004
Location: my room
Posted: 10th May 2005 21:03
Quote: "2) Am I the only one that believes graphics/physics/sound/bells and whistles dont makes games fun? They're nice and can AID a game to be fun - but since when did ANY of the classic games that were released 20 years ago REQUIRE hardware accelerated graphics, sound and now physics to be fun? They didn't."


Okay, graphics and sound aren't COMPLETELY necessary to make games fun, but they do help to make an immersive experience. As more and more people watch movies with special effects these days, they become desensitised to them. You can't get away with anything less than perfect effects anymore; even the greenest layman can notice a bad greenscreen shot. That of course spills over into the gaming world.

The first Doom scared the poop out of me when it came out. Playing it late at night, alone in the darkness, I thought I was really being attacked by monsters from hell. Since I've seen Doom 3, though, I can't get that involved in Doom 1 anymore. They'll growl at me and slide over, and the best response they'll get out of me is a pity-chuckle and a head shake.

Similarly with physics: even if I'm fully immersed in a great-looking game, I'm quickly reminded that it's only virtual reality when I bump into a cardboard box and it stops me dead in my tracks instead of being flung across the room.

If you wanna fool people into believing your fantasy world is real for that hour or whatever they're playing it, you HAVE to have nice graphics, sound and physics - and the occasional 'bell and/or whistle' won't hurt either.

Sure, games don't NEED these things to make them fun, but I don't like the kind of thinking where people use this as an excuse to make a half-assed game. (Not that I think anyone here did that, of course.)


Aaahhhh... it's good to vent.

http://www.toontch.com
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 10th May 2005 22:16
Heh... the thing is, gamers don't want real graphics and physics - they just want them to be realistic.

You know, like Hollywood real.
Still, a hardware chip for this stuff is awesome!
I'm surprised I never heard anything about it after GDC; normally it's one of those things everyone mentions.

[ Liandri Corporation / Chief Software Architect ]
Dave J
Retired Moderator
21
Years of Service
User Offline
Joined: 11th Feb 2003
Location: Secret Military Pub, Down Under
Posted: 10th May 2005 23:10
What's the difference between this and a second CPU?


"Computers are useless, they can only give you answers."
OSX Using Happy Dude
21
Years of Service
User Offline
Joined: 21st Aug 2003
Location: At home
Posted: 10th May 2005 23:35 Edited at: 10th May 2005 23:38
A CPU is for general things.

FPU is for mathematical operations
GPU is for graphics operations
PPU is for physics operations

All of which are designed to take load off the main CPU (or CPUs if you have more than one), which is important as it can then do more things.

Hopefully in a few years there might be an AIU (AI Unit)...
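The division of labour listed above can be mimicked in ordinary software: hand a self-contained job to a dedicated worker and keep the main thread free, much as the CPU hands graphics to the GPU or physics to a PPU. A minimal sketch in Python - the function names are hypothetical, and a worker thread is only standing in for dedicated silicon:

```python
from concurrent.futures import ThreadPoolExecutor

def integrate(positions, velocities, dt):
    """Stand-in 'physics' step: simple Euler integration."""
    return [p + v * dt for p, v in zip(positions, velocities)]

def run_frame(executor, positions, velocities, dt):
    # Hand the physics step to the worker, like a CPU handing work to a PPU...
    future = executor.submit(integrate, positions, velocities, dt)
    # ...while the "main CPU" is free to do other per-frame work here
    # (AI, input, sound) before collecting the result.
    return future.result()

with ThreadPoolExecutor(max_workers=1) as pool:
    pos = run_frame(pool, [0.0, 1.0], [2.0, -1.0], 0.5)
    print(pos)  # [1.0, 0.5]
```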

Visit http://homepages.nildram.co.uk/~nickk
Calm down dear! Its only The Unofficial DB Sci-Fi Con...
http://spaces.msn.com/members/BouncyBrick/
Benjamin
21
Years of Service
User Offline
Joined: 24th Nov 2002
Location: France
Posted: 10th May 2005 23:55
Quote: "FPU"

Floating point unit!


"Lets migrate like bricks" - Me
TKF15H
21
Years of Service
User Offline
Joined: 20th Jul 2003
Location: Rio de Janeiro
Posted: 11th May 2005 00:03
I think the name "PPU" isn't fit for a physics processor. "PPU" is better for "Pie Processing Unit".
Mmm... pie...

OSX Using Happy Dude
21
Years of Service
User Offline
Joined: 21st Aug 2003
Location: At home
Posted: 11th May 2005 00:25
Quote: "Floating point unit!"

Mathematical operations! It's actually now part of the processor (and has been since the 1990s).

Visit http://homepages.nildram.co.uk/~nickk
Calm down dear! Its only The Unofficial DB Sci-Fi Con...
http://spaces.msn.com/members/BouncyBrick/
Benjamin
21
Years of Service
User Offline
Joined: 24th Nov 2002
Location: France
Posted: 11th May 2005 00:29
Calm down dear! It's only a floating point unit...


"Lets migrate like bricks" - Me
flibX0r
21
Years of Service
User Offline
Joined: 14th Feb 2003
Location: Western Australia
Posted: 11th May 2005 00:33
my computer can do lots of FLOPS





You can't wield supreme executive power just because some watery tart threw a sword at you
Jess T
Retired Moderator
21
Years of Service
User Offline
Joined: 20th Sep 2003
Location: Over There... Kablam!
Posted: 11th May 2005 01:01
Urg... don't remind me of MATLAB... that thing's evil!


Team EOD :: All-Round Nice Guy
dbHelp
Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 11th May 2005 01:18
I honestly don't see the point of it. CPUs are already well designed for physics maths. We have GPUs separately because they perform vastly different mathematical commands. Physics and collision don't.

"Grif, if there's one thing I've learned working with you, it's there there's always a margin for error."
"It's pronounced margarine, dumbass!"
flibX0r
21
Years of Service
User Offline
Joined: 14th Feb 2003
Location: Western Australia
Posted: 11th May 2005 01:31 Edited at: 11th May 2005 01:33
Quote: "CPUs are already well designed for physics mathematics purposes"


CPUs are already well designed for graphics maths too. The reason we have a separate GPU is so it can be radically accelerated, by designing the chip specifically for that purpose rather than using a general-purpose design - plus the addition of its own dedicated memory, with a wide-ass bus.

The benefit of having a dedicated physics card is the same as a graphics card: increased speed, separate dedicated memory, and relieving strain on the CPU, leaving more CPU time available for more CPU-intensive code (like hard-core AI systems - wouldn't that be nice).

Quote: "We have GPUs seperatly because they perform vastly different mathematical commands"


They don't. Really, they don't. It's just floating-point math.



You can't wield supreme executive power just because some watery tart threw a sword at you
mm0zct
20
Years of Service
User Offline
Joined: 18th Nov 2003
Location: scotland-uk
Posted: 11th May 2005 01:31 Edited at: 11th May 2005 01:32
Just now we use simple maths for physics because that's all the CPU will allow - like the GPU situation used to be. Freeing the CPU of physics calculations means the same workload can be spent on things like AI, which do make a game more fun/challenging. If you dedicate a whole processor to just physics, the physics can become much more complex and specialised, instead of having to be simple because it's sharing a processor with things like AI.

Edit: damn, you got there first
(Mum came and interrupted me while writing)

http://www.larinar.tk
AMD athlon 64 3000+, 512mb ddr400, abit kv8, 160gb hdd, gigabit lan, ati radeon 9800se 128mb.
billy the kid
19
Years of Service
User Offline
Joined: 7th Dec 2004
Location:
Posted: 11th May 2005 01:52 Edited at: 11th May 2005 02:07
Quote: "We have GPUs seperatly because they perform vastly different mathematical commands."


*insert buzzer sound* Please try again.

Quote: "They don't. Really, they don't. It's just floating point math"


*insert ding ding ding sound* Yes.

One big advantage of a PPU that a lot of people don't seem to touch on very often is the impact it will have on simulations in industry. Sometimes these simulations take days or weeks to run (I'm not exaggerating). With a PPU these simulations would take hours or days instead. The real market, at least initially, will be the realistic-simulation market, and then later games will start using it. I think games would start using it now, but most gamers don't have one and won't for a while.

P.S. The CPU can calculate anything and everything a GPU or a PPU can. However, the hardware is not specialised for graphics, physics, etc., so it's a lot slower at those calculations. If you really wanted to, you could run DirectX in "reference" mode, which means everything is done on the CPU - but it takes forever. You would actually be able to do the latest effects like HDR even with a crappy video card; it would just take forever for one frame to render.
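The software-fallback point can be illustrated with per-pixel alpha blending done entirely in ordinary CPU code - the kind of operation a GPU hard-wires. This is a hedged sketch of the idea only, not how DirectX's reference rasteriser is actually written:

```python
def blend(src, dst, alpha):
    """Alpha-blend two RGB pixels channel by channel on the CPU."""
    return tuple(round(alpha * s + (1 - alpha) * d) for s, d in zip(src, dst))

# One pixel is instant, but a 1024x768 frame means ~800,000 such calls
# per frame -- which is why an all-software path renders so slowly, even
# though it can produce exactly the same image as the hardware.
print(blend((255, 0, 0), (0, 0, 255), 0.5))  # (128, 0, 128)
```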
Lukas W
21
Years of Service
User Offline
Joined: 5th Sep 2003
Location: Sweden
Posted: 11th May 2005 02:05
Just think of it this way:

now we might finally be able to add real* sand.

I can see it before me: Half-Life 3 (if there will be one) with Gordon standing in the sand. There is wind, sand in his eye; his orange suit is covered in sand; his hair moves with the wind. An alien comes from behind, making footsteps in the sand. Gordon shoots the alien with his shotgun and the creature falls to the ground, where it leaves a big hole, and the wind covers the alien with sand so it eventually disappears. But one can dig it back up and shoot it away with the antigravity gun...

Or how about just a simple car game with sand?

* in the virtual world

Dave J
Retired Moderator
21
Years of Service
User Offline
Joined: 11th Feb 2003
Location: Secret Military Pub, Down Under
Posted: 11th May 2005 11:56
It's not going to happen.


"Computers are useless, they can only give you answers."
billy the kid
19
Years of Service
User Offline
Joined: 7th Dec 2004
Location:
Posted: 11th May 2005 13:23 Edited at: 11th May 2005 13:24
Actually I think that could definitely be a possibility. PPU technology may get to the point where you can actually build a combustion engine for a car and have it drive the car. Not sure why you would do that unless it was for a simulation, but the possibility exists.

Like it was said before, this will open up whole new areas of gameplay. In next-generation games, everything will be run by physics - and I mean EVERYTHING minus characters - similar to the way Half-Life 2 was made. Even Black & White 2, an RTS/god game, has its own physics engine. Maybe someday there won't be any more character animations at all: characters will just walk, run, etc. completely based on physics rules. That would cut down some character development time.

BTW, based on demos I've seen of B&W 2, the physics actually enhances the gameplay and opens new areas of gameplay. I think someone earlier said physics wouldn't open new areas of gameplay, but with B&W 2 this clearly isn't true.

Basically, the more stuff the engine can do, the more gameplay options are available. That doesn't mean better games will be made - just that there are more gameplay options.
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 13:24
Erm...

While it's true that each of the processing units is just a calculator, I think what's being forgotten here is how each of your current pieces of hardware actually works.

Remember, everything calculates in integers; even the floating-point units specialised for FP maths are still built on integer-based binary systems. As a result, these units are created specifically to calculate the task at hand.

While it's true that underneath they're all doing the same mathematics, and they all have similar core operations like

ADD, SUB, MUL, DIV

the fact remains that they all have specific instructions unique to the task they're set to perform.

For example, the FPU has enlarged registers (80 bits wide) which, combined with certain instructions, let it handle a full-length floating-point number - with the extra information required to calculate it - at roughly the same speed as a 32-bit integer.

Flip over to the GPU and you'll find the standard instructions, sure, but my CPU doesn't have a DOT instruction specifically designed to calculate the dot product of two vectors.

That one instruction does in a single cycle what a CPU would take five cycles to accomplish.

Even the CPU itself has changed and been extended for the digital-media age. The MMX instruction set extended the processor's ability to calculate 16-bit colour in single passes, allowing it to run high-colour desktops without the need for a high-performance 2D card. It isn't used so much now, unless you're using Linux.

While it's true there's not really anything one processing unit can do that another can't, the question isn't about technical ability, but about the speed with which it can be achieved.

Why do you think the CISC x86 processor, despite being difficult to design for, is used so much more widely than the RISC PPC processor? Because it's technically capable of much, much more without software needing to be developed specifically for each revision of the chip.

From the website and the speech, I reckon the PlayStation 3 and the Revolution are going to have this PPU built in. Just a hunch, but I can see it happening.
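The DOT example above can be made concrete. In scalar code a 3-vector dot product decomposes into the separate multiplies and adds that a dedicated DOT instruction fuses; a sketch (the instruction counts are the post's claim, not a promise about any particular chip):

```python
def dot(a, b):
    """Dot product of two 3-vectors as scalar code issues it:
    three multiplies and two adds -- the five separate operations
    the post says a GPU's DOT instruction collapses into one cycle."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

print(dot((1.0, 2.0, 3.0), (4.0, -5.0, 6.0)))  # 12.0
```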

[ Liandri Corporation / Chief Software Architect ]
billy the kid
19
Years of Service
User Offline
Joined: 7th Dec 2004
Location:
Posted: 11th May 2005 13:34 Edited at: 11th May 2005 13:35
I'm not sure if you're siding with the "the CPU can do everything, just slower than specialised hardware" people, or siding with the "the CPU can't do everything" people while at the same time siding with the former. I think that sentence makes about as much sense as the purpose of that previous post.
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 14:06 Edited at: 11th May 2005 14:16
I'm not siding with anyone... and I certainly don't remember reading anywhere in this thread that "the CPU can't do everything".

[ Liandri Corporation / Chief Software Architect ]
Dave J
Retired Moderator
21
Years of Service
User Offline
Joined: 11th Feb 2003
Location: Secret Military Pub, Down Under
Posted: 11th May 2005 14:17
We'll have none of that language, thank you. That response was completely uncalled for.


"Computers are useless, they can only give you answers."
BearCDPOLD
21
Years of Service
User Offline
Joined: 16th Oct 2003
Location: AZ,USA
Posted: 11th May 2005 15:07
This PPU thing sounds cool, but sometimes I find super-realistic or exaggerated physics a bit tedious in games. You find yourself playing the game just for the sake of kicking boxes and such around; we're doing okay with what we have. It could save regular CPU time, but why not just make a faster CPU?

An AI unit would be worthwhile though. If anything needs its own processor, it's AI.

Crazy Donut Productions
Current Project: A Redneck game
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 15:19
Quote: "We'll have none of that language, thank you. That response was completely uncalled for. "


There is a post about this in the Moderator forum.
You're not allowed to edit posts as you see fit, and Rich promised me this wouldn't be an issue again after Mouse.

Quote: "This PPU thing sounds like cool, but sometimes I find super-realistic or exaggerated physics to be a bit tedious in games. You find yourself playing the game for the sake of kicking boxes and such around, we're doing okay with what we have. It could save regular CPU time, but why not just make a faster CPU?

An AI unit would be worthwhile though. If anything needs its own processor it's AI."


An AI unit would be hard right now. Although the IBM PPCs have started the groundwork for a customisable processor unit, it's quite far from being a realistic technology that could be used on its own.

I agree with the above though. Physics, while being used, never really became practical until the CPU was freed from the majority of graphics-processing routines.

I can see that in about ten years' time, computers won't revolve around the CPU; rather, the CPU will simply be used as a staging area to let everything cooperate with everything else. Much like mobo chipsets do right now, only more intelligently.

[ Liandri Corporation / Chief Software Architect ]
billy the kid
19
Years of Service
User Offline
Joined: 7th Dec 2004
Location:
Posted: 11th May 2005 15:39 Edited at: 11th May 2005 15:45
Yeah, really the only obstacle to creating an AIPU is pathfinding, as it's still quite variable. Currently it also happens to be the thing that takes up the most CPU time for AI. Other stuff like perception (sight, touch, etc.) and steering behaviours is fairly standard for games, minus the high-level logic. It's just pathfinding that's the real problem right now.

And for those of you who think A* is the standard pathfinding algorithm: you're right and wrong. You're right in that it's the standard algorithm for finding the best path out of many possible paths. However, there are many ways of creating those paths, and right now most of them require at least some manual input. The current goal is to make path setup completely automated for every case; once that's achieved, pathfinding will be ready for a hardware implementation. Actually, I was reading a white paper from RenderWare and they seem to be onto something, but I'm not sure it will work - or I should say, I'm not sure it will work for all cases.
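The A* the post mentions is small enough to sketch. A minimal grid version in Python, with a hand-built grid - exactly the kind of manual path setup the post says would need automating before a hardware implementation makes sense:

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 2D grid of 0 (free) / 1 (blocked) cells,
    4-way movement, Manhattan-distance heuristic."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start, [start])]  # (f, g, node, path)
    best_g = {start: 0}
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    heapq.heappush(
                        open_heap,
                        (ng + h((r, c)), ng, (r, c), path + [(r, c)]))
    return None  # no route exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)  # 7 cells: over the top and around the wall
```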
Jess T
Retired Moderator
21
Years of Service
User Offline
Joined: 20th Sep 2003
Location: Over There... Kablam!
Posted: 11th May 2005 16:06
Quote: "There is a post about this in the Moderator forum."


... There is?

Show me.


Team EOD :: All-Round Nice Guy
dbHelp
Dave J
Retired Moderator
21
Years of Service
User Offline
Joined: 11th Feb 2003
Location: Secret Military Pub, Down Under
Posted: 11th May 2005 16:27
Quote: "There is a post about this in the Moderator forum.
Your not allowed to edit posts as you see fit, and Rich had promised me this wouldn't be an issue again after Mouse."


I'd also like to see this 'post'. Stop insulting our forum users and we'll stop editing your messages. We don't edit out borderline content that might be considered offensive, however, your post was a straight out flamebait and that's something we're trying to prevent.


"Computers are useless, they can only give you answers."
billy the kid
19
Years of Service
User Offline
Joined: 7th Dec 2004
Location:
Posted: 11th May 2005 16:36
Hmm, I assume those deleted comments were directed at me. I'm kinda curious what it was, actually.

Tesio - I wasn't trying to be offensive or anything. It's just that your post kinda came out of left field. Although it's on the topic of CPUs, GPUs, etc., we were never really talking about the specifics of the instructions and such. The post just seems really out of place with the rest of the thread to me. That's what I was getting at with my comment; I could probably have worded it better. And I certainly didn't mean to offend.
Jeku
Moderator
21
Years of Service
User Offline
Joined: 4th Jul 2003
Location: Vancouver, British Columbia, Canada
Posted: 11th May 2005 16:52
Quote: "I can see in about 10years time, computers won't be running around the CPU; but rather the CPU will simply be used as a staging area to allow everything to cooperate with each other."


I see this too. Kind of like the CPU connecting to the other PUs the way our brain connects to the other parts of our body.


--[R.O.B.O.I. and FireTris Coming Soon]--
Dave J
Retired Moderator
21
Years of Service
User Offline
Joined: 11th Feb 2003
Location: Secret Military Pub, Down Under
Posted: 11th May 2005 16:58
Well, that's what they call the CPU, isn't it? The "brain of the computer"!


"Computers are useless, they can only give you answers."
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 18:31
Yeah, the CPU is the brain. It's becoming more like the cerebrum, though - the control centre of the brain - as companies appear to be slowly but surely dividing up the tasks of a single chip across multiple chips.

No doubt this will come full circle at some point, and the additional units will be integrated back into the CPU. That isn't likely for a while though, not while several companies are setting standards in different areas.

I would be interested in seeing IBM and NVIDIA come up with a combined CPU/GPU solution though. It would really help to unify them, especially given that GPUs are still running at around 500 MHz while CPUs are breaking 3 GHz.

Then again, processor design... heh, not really something I understand once you get past the simplistic 8086.

Quote: "... There is?

Show me."


Can't - I'm not a member. I just know that Rich made a post, back when the new moderators were abusing their powers, about what they are and aren't allowed to do.

He directly stated that one of the rules was that moderators COULD NOT EDIT users' posts, as this was my major gripe at the time, given that Mouse was editing and deleting my posts as he saw fit.

[ Liandri Corporation / Chief Software Architect ]
OSX Using Happy Dude
21
Years of Service
User Offline
Joined: 21st Aug 2003
Location: At home
Posted: 11th May 2005 18:32
Acorn, before they decided to drop out of making computers, created a prototype computer that didn't use clock cycles to make sure instructions were executed at the correct time - if I remember correctly, all instructions took the same amount of time.

Visit http://homepages.nildram.co.uk/~nickk
Calm down dear! Its only The Unofficial DB Sci-Fi Con...
http://spaces.msn.com/members/BouncyBrick/
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 18:41
Async Processing? Would be interesting to read up on that one.

[ Liandri Corporation / Chief Software Architect ]
OSX Using Happy Dude
21
Years of Service
User Offline
Joined: 21st Aug 2003
Location: At home
Posted: 11th May 2005 20:05
I don't know what happened to the research papers - they must be around somewhere.

It's something that AMD and Intel should start working on, along with 128-bit processors.

Visit http://homepages.nildram.co.uk/~nickk
Calm down dear! Its only The Unofficial DB Sci-Fi Con...
http://spaces.msn.com/members/BouncyBrick/
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 20:42
Yeah, I'm worried that we're only just slowly getting 64-bit.
I guess this is because of the way x86 is designed: there's no real way to upgrade it without causing either incompatibility or more pain-in-the-arse extensions.

Not to mention IA-64 really isn't anything to write home about - heh, seriously, their IA-32 processors outperform it.

Hopefully, though, Intel will skip to 128-bit x86 and force the industry to play catch-up. Really, their aim should be to develop a processor around the CLR, as it is very quickly becoming a universal standard now that Mono and .NET are compatible.

[ Liandri Corporation / Chief Software Architect ]
IanM
Retired Moderator
22
Years of Service
User Offline
Joined: 11th Sep 2002
Location: In my moon base
Posted: 11th May 2005 21:30
The general-purpose/dedicated question was answered long ago. There is no way that a modern CPU can perform faster than a modern GPU - the proof of that is the fact that GPUs exist. If it weren't true, then Nvidia and ATI would bung a P4 on the graphics card and save on all the development costs.

Yes, all of the calculations basically come down to binary integer maths, but those operations can be hard-wired into the chip to take place in minimal clock cycles. A general-purpose chip cannot have this level of dedication, because if it did, it wouldn't be general-purpose.

I don't know whether the PPU will actually take off - but if the makers include a standard SDK that uses a PPU if one is in place, and falls back to a processor-based mode if not, then they might be in with a chance. Especially if they can get one of the big console makers to take a chance on it.
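The fallback idea in the last paragraph is a plain dispatch pattern; a sketch with entirely hypothetical names (no relation to any real PPU SDK):

```python
class SoftwarePhysics:
    """CPU fallback path: always available, no special hardware."""
    def step(self, positions, velocities, dt):
        return [p + v * dt for p, v in zip(positions, velocities)]

class HardwarePhysics(SoftwarePhysics):
    """Stand-in for a PPU-backed path; here it just reuses the same math."""

def make_physics(ppu_present):
    # The SDK picks the accelerated path when the card is detected and
    # silently falls back to the processor-based mode otherwise, so game
    # code calls step() the same way either way.
    return HardwarePhysics() if ppu_present else SoftwarePhysics()

engine = make_physics(ppu_present=False)  # no card installed
print(engine.step([0.0], [3.0], 0.5))  # [1.5]
```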

*** Coming soon - Network Plug-in - Check my site for info ***
For free Plug-ins and source code http://www.matrix1.demon.co.uk
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th May 2005 21:49
Yeah they do, Ian - they use the NovodeX physics engine, which is compatible with both the current PCI PPU card and CPU-based physics.

It looks like a very good system, and playing with NovodeX Rocket (heh, catapulting ragdolls at a tower was fun)... it does look like a real alternative. Especially as they package an open-source framework (which is .NET compatible) that sits on top of NovodeX and acts a bit like DBP does with DirectX.

This company has put together a very strong portfolio and a damn good argument for using the PPU. I'm looking over the specifications and will continue to over the next month, to see whether it can be implemented into Kismet.

/-/

On a side note, has anyone seen that AMD are planning to release a new multi-core range of Athlon 64/Opteron processors in June?
I'm definitely looking forward to getting my hands on the cheaper 4200+ sometime in August, once there are enough mobo choices. Hopefully NVIDIA will give it nForce4 support.

[ Liandri Corporation / Chief Software Architect ]
Redmotion
21
Years of Service
User Offline
Joined: 16th Jan 2003
Location: Mmm mmm.. Marmite
Posted: 11th May 2005 22:57
Looks like just another thing we PC users will have to keep upgrading to play the latest games.

(I'll leave my rant about where gamers really want games to go for another thread.)

Quote: "Its something that AMD and Intel should start working on, along with 128-bit processors."


The way we're being drip-fed CPU tech, you'll be waiting until at least 2015 for 128-bit PCs. Meanwhile the PS5 will host five gigabit 20 GHz co-processors and you'll be plugging yourself into the joystick port.

PROJECTS:
Scorched Real Estate - beta release scheduled for october
OSX Using Happy Dude
21
Years of Service
User Offline
Joined: 21st Aug 2003
Location: At home
Posted: 11th May 2005 23:54
Quote: "you'll be waiting until at least 2015 for 128bit PCs"

It's a shame really...

Visit http://homepages.nildram.co.uk/~nickk
Calm down dear! Its only The Unofficial DB Sci-Fi Con...
http://spaces.msn.com/members/BouncyBrick/
Lukas W
21
Years of Service
User Offline
Joined: 5th Sep 2003
Location: Sweden
Posted: 12th May 2005 00:20
Ehum, sand anyone?
Wouldn't it be possible to make very realistic sand environments, the same way it's now possible to make quite impressive water - waves, reflection, etc.? I don't think anyone back in the 1990s could have imagined realistic water in games, just as it's almost impossible for us to imagine sand in games now.

Anyone understand what I said?

Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 13th May 2005 09:13
My money is on this never catching on. The fewer separate pieces of hardware there are to account for, the easier it is to move forward and preserve backwards compatibility. There's nothing I've heard of that you can do on this PPU that you can't do at an equal speed on a separate CPU - quite unlike GPUs (try running an Unreal game in software mode to see what I mean).

"Grif, if there's one thing I've learned working with you, it's there there's always a margin for error."
"It's pronounced margarine, dumbass!"
billy the kid
19
Years of Service
User Offline
Joined: 7th Dec 2004
Location:
Posted: 13th May 2005 09:18
People said the same thing about GPUs. And like all the GPU naysayers, you will be proved wrong too.
Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 13th May 2005 09:23
Quote: " People said the same thing about GPUs."


Okay, but can you think of even one equivalent to onboard AS, FSAA, T&L, etc. for a physics processor? That's what makes the difference for GPUs.

Quote: "And like all the GPU nay-sayers, you will be proved wrong too."


Not to mix analogies, but don't count your eggs before the bandwagon even exists!

Oh, wait...

"Grif, if there's one thing I've learned working with you, it's there there's always a margin for error."
"It's pronounced margarine, dumbass!"
