Geek Culture / Deciding which GPU to get...

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 10th Jan 2009 12:13 Edited at: 10th Jan 2009 12:20
Well, I know there's a thread similar to this further down the page, but I'm narrowing things down a bit more here...

Anyway, I am searching for a new GPU. Here are my options:
-GTX 280 (this is currently looking to be the one I'll get, although expensive)
-GTX 260 (sounds good, is cheaper, but not as good performance)
-Radeon HD 4850 (not sounding very good from tests I've seen)
-Radeon HD 4870 (sounds pretty good, but runs hot and is expensive)
-Nvidia GeForce 9800 GX2 (haven't found much on this, but it sounds good)

So, which do you guys suggest?

-Toasty

Demon Air 3D
16
Years of Service
User Offline
Joined: 16th Sep 2008
Location: England
Posted: 10th Jan 2009 12:35
I personally suggest the Nvidia GeForce 9800 GX2. It's pretty much a recommended card; btw, if you get that card you are a lucky kid.
Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 10th Jan 2009 12:37 Edited at: 10th Jan 2009 12:37
Yeah, but check out this page: http://www.tomshardware.com/reviews/nvidia-gtx-280,1953-18.html

The GTX 280 is a far better card in comparison.

Demon Air 3D
16
Years of Service
User Offline
Joined: 16th Sep 2008
Location: England
Posted: 10th Jan 2009 12:56
Actually, go with the GTX 280, that seems very good. Especially compared to the 9800s.
BiggAdd
Retired Moderator
20
Years of Service
User Offline
Joined: 6th Aug 2004
Location: != null
Posted: 10th Jan 2009 13:12
My brother has two 9800 GX2s. They are very good cards, but one 9800 isn't enough to handle Crysis comfortably (1080p). Takes one and a bit.
But it is still a good card.

Puts out an incredible amount of heat though.

The GTX 280 has similar specs to the 9800 GX2, but it is one card instead of two stuck together, so it will probably put out less heat and give a slight performance increase.

It also means you can have 3 of them when they eventually sell for £50 each.

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 10th Jan 2009 13:16
Well, my dad would consider getting multiple GPUs so I could use SLI, but the problem is that I only have one PCI-E slot, as well as the hefty price of multiple GPUs.

Here are my specs:
Motherboard: 1 x Gigabyte GA-MA78GM-S2H M/b - AMD 780G/SB700 chipset, DDR2 800, PCI-E x16, Integrated VGA, DVI/HDMI with HDCP, SATA II RAID, USB, 1394, 8-Ch, mATX $134.00
RAM: 1x Corsair TWIN2X4096-6400C5 4GB (2x XMS2 2GB) PC-6400 (800MHz) DDR2 RAM, 2x240-pin DIMMs, Non ECC Unbuffered, 5-5-5-18 $92.00
Processor: 1 x AMD PHENOM X4 Quad Core 9550 CPU, 2.2GHz (95W), 4MB Cache, Socket AM2+ $210.00
Hard drive: 1 x Seagate 500GB 'ST3500320AS' SATA II 3Gb/s NCQ HDD - 7200rpm, 32MB Cache, 5-year warranty $85.00

Chenak
22
Years of Service
User Offline
Joined: 13th Sep 2002
Location: United Kingdom
Posted: 10th Jan 2009 15:04 Edited at: 10th Jan 2009 15:05
I have the HD 4870; it's an absolutely fantastic card. When overclocked (and sometimes even at stock) it comes within 0.2 FPS of the GeForce GTX 280, which is insane. Very good value for money; mine was only £160. There is the HD 4870 X2 out too, which is so powerful it could take over the world.

http://www.tomshardware.com/reviews/radeon-hd-4870,1964-9.html
El Goorf
18
Years of Service
User Offline
Joined: 17th Sep 2006
Location: Uni: Manchester, Home: Dunstable
Posted: 10th Jan 2009 19:40
It seems usual to complain about graphics cards putting out heat, but IMO any self-respecting man has the brains to get some kind of aftermarket cooling for their PC. My GPU has a FANLESS cooler which cost JUST £30, runs at 35C under stress, and plays Crysis on high settings; surely that's a good investment. I'd hope anyone is willing to get a £30 cheaper card if it means they can afford better cooling for whatever they end up getting.

According to the most recent Custom PC benchmarks, the ATi HD 4870 offers the best bang per buck, which is typical for ATi. Nvidia, on the other hand, has the reputation for making the most powerful cards, but at a lot more money relative to performance. That said, when the HD 48xx series came out, Nvidia nervously lowered its prices in order to compete.

http://notmybase.com
All my base are not belong to anyone.
Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 10th Jan 2009 22:12
I have a GeForce 9800 GTX+. Eats games alive.

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 10th Jan 2009 23:59 Edited at: 11th Jan 2009 02:03
Well those charts on Tom's Hardware say that the 280 is the best at higher resolutions, while the GX2 beats it at lower resolutions.

GTX 280 is looking like the best at this point...

EDIT: Just noticed on various sites that the GTX 285 and 295 go on sale Thursday. My power supply is 550W, and the GTX 285 has a minimum power requirement of 550W. So if I get the GTX 285, I'll have the best my comp can handle.

Raven
20
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 11th Jan 2009 13:59
You know, what seems quite bizarre to me is that you've already made up your mind to get an NVIDIA card over an ATI one, only citing ATI as an option because you felt you needed to or something.

For less than the price of a single NVIDIA GTX 280, you can get a Sapphire ATi Radeon HD 4850 512MB X2. Combined they use less power, they would out-perform the NVIDIA GTX 280 by a comfortable margin, and they are quieter... plus the most important part is that the AMD Series-7 chipset and Phenom processor are designed to actually run better with ATI GPUs, even without AMD Fusion installed.

Not to mention the drivers are far more mature.
In fact, even if you were to put only a single ATi HD 48xx against an NVIDIA 280, it's only in Unreal Engine 3 games that the 280 really has a noticeable performance lead, and moreover this is ONLY at much higher resolutions. As soon as you use anti-aliasing and anisotropic filtering, that performance gap closes greatly again.

What is possibly more confusing is that only CRT monitors are even capable of resolutions past 1440x900, so what the hell does it matter if the 280 is quicker by a few fps at 2560x1920? Your monitor can't even sodding display that high.

I mean, half of the arguing over which is best is kinda retarded; even my old HD 2600 XT 256MB was capable of 1080p HDMI gaming with max graphics settings on Tomb Raider Underworld, Crysis, and Unreal Tournament 3 at 60fps+ with 4x AA and 8x AF.

Sure, I've recently bought a new card from Josh (HD 3870 512MB), but really this was mainly to get higher AA/AF at 1080p, and when testing XNA/DirectX code it pays to have better-performing hardware than what the code is destined for.

I still have my HD 2600 XT for testing release code.
What's probably better is that I'm running each of those cards, even at full performance, on a 350W PSU in a cube case. Maybe they do run slightly hotter than their NVIDIA counterparts at 60C, but you know what, my chipset, CPU and memory are ALSO running at similar temperatures.

To be perfectly honest, while cooler is better, you don't have to worry unless the chips start running higher than 95C; there is so much emphasis put on how hot chips run now... but hell, five years ago if a chip ran at 60C it would be considered extremely cool. Now it's a bad thing?

Seriously, people need to get a grip.

About the only reason I could see for getting a GeForce over a Radeon would be its support for hardware PhysX, but I've not seen any tests that show how much that actually helps game performance.

Your signature has been erased by a mod because it's larger than 600x120...
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 11th Jan 2009 15:29
I want to know something once and for all. Basically, people keep saying that ATI's flagship card, the 4870 X2, is better than Nvidia's, but:

1. Is it fair to compare a card which still utilizes X-Fire drivers, effectively being two cards sandwiched together? It's not like it's simply dual core.

2. I can't find ANY benchmarks online to prove this claim.

If someone can provide proof that this is true, and explain why it's actually fair to compare a dual card to a single card, then I will accept that ATI is better. TBH, I don't care which vendor produces the best card, only which one it is, so I just want to know for knowledge's sake.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 00:04
Raven, please, settle down. I am a big supporter of ATi, but at the moment Nvidia has the most powerful GPUs on the market.

I was considering the 4870, but it runs too damn hot, as well as the fact that Nvidia cards have... well, all that Nvidia stuff like CUDA etc...

Lemonade
16
Years of Service
User Offline
Joined: 10th Dec 2008
Location:
Posted: 12th Jan 2009 00:13
I recommend getting an Nvidia card. I've used many graphics cards and not had one problem.

I heard that ATi uses an older "architecture", whatever that means. Is that even remotely true?
Chenak
22
Years of Service
User Offline
Joined: 13th Sep 2002
Location: United Kingdom
Posted: 12th Jan 2009 01:10
Quote: "I want to know something once and for all. Basicly people keep saying that ATI's flagship card the 4870X2, Is better than Nvidia's, But:

1. Is it fair to compare a card, Which still utilizes X-Fire drivers effectivly being two cards sandwiched together? Its not like its simply dual core.

2. I cant find ANY benchmarks online to prove this claim.

If one can provide proof that this is true, And why its actualy fair to compare a dual card to a single card, Then i will accept that ATI is better. TBH, i dont care what vendor produces the best card, Only that it is the best, So i just want to know for knowlage's sake."


Yes, it is fair to compare the cards. They have one price and take up one slot. Why isn't it fair? If Nvidia doesn't release a dual card for the current series, then what else is there to compare it to?

As for not finding any benchmarks, here you go. It also compares the HD 4870 to the 9800 GX2, which seems to be the latest Nvidia dual card:

http://www.tomshardware.com/reviews/4870-x2-amd,1992-5.html
AndrewT
18
Years of Service
User Offline
Joined: 11th Feb 2007
Location: MI, USA
Posted: 12th Jan 2009 01:29
Quote: "What is possibly more confusing is that only CRT mointors are even capable of resolutions past 1440x900"


What?! Then how is my laptop displaying 1680x1050?

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 12th Jan 2009 01:45
Yes, surely you made a mistake. My brother runs stuff at 1980x1050 on his LCD.....

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 02:10
Quote: "What is possibly more confusing is that only CRT mointors are even capable of resolutions past 1440x900"


Yeah, I'm on an LCD and I'm running 1680x1050.

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 12th Jan 2009 02:50 Edited at: 12th Jan 2009 02:52
Quote: "What is possibly more confusing is that only CRT mointors are even capable of resolutions past 1440x900"

Lol, no offence dude, but what year are you living in? I'm just about to buy a 24-inch TFT with a resolution of 1900x1200.

As for ATI, it would appear that the X2's strength is memory-intensive games, because of its 2GB of GDDR5.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 05:48
Okay, I've noticed that the HD 4870 X2 is the best out there ATM, but it's really expensive...

Sunflash
19
Years of Service
User Offline
Joined: 28th Jun 2005
Location: Seattle, Wa
Posted: 12th Jan 2009 05:54 Edited at: 12th Jan 2009 06:00
WAIT!! DON'T GET THE GTX 280 YET:

Seriously though, you'd be making a big mistake you will later regret. Nvidia just released the GTX 295, which has 1792MB of GDDR3 RAM, unlike the GTX 280 which only has 1GB of GDDR3.

I bought a GTX 280 with my computer and LOVE it. I wish I had the money to get another one and run SLI, but truthfully I think you'd then have graphics capabilities far beyond what most games demand these days.

Also, in regard to GTX280/285/295 vs ATI Radeon HD 4850/4870, I wouldn't recommend going with ATI right now. I used to be a big proponent of their cards, but recently Nvidia has managed to turn the tables. Here's why:


SLI vs Crossfire- As I understand it, Nvidia SLI currently outperforms ATI Crossfire solutions. In addition, using more than two cards with SLI is still going to get you more of a performance boost than Crossfire, so you really have nothing to gain from using a Crossfire solution yet.

Nvidia PhysX- As you know, every Nvidia card in the 200 series (and a few cards below that) has an included PhysX processor to boost your games dramatically. This is something that ATI cards lack, and won't be getting anytime soon.

PhysX Scalability- One of the amazing things with PhysX in respect to Nvidia is scalability. When some people think of multiple PhysX processors, they think it's 2x as powerful as one. While this is somewhat true, there's really more to it than that. The PhysX physics engine is built to be dynamically scalable; this means that if you are playing a game with a single PhysX-enabled GPU, imagine an explosion that blows a building up into 1000 pieces. We can "oooh and awww" at that, thinking it's the best we get, until we utilise a second PhysX-enabled GPU. With more PhysX processing ability, you will see that the same explosion you just saw using the single GPU now scales up in quality, possibly giving you 5000 pieces of exploding rubble. This scaling effect is something that really sets PhysX apart from other engines; the more you give it, the more it gives you!
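
Just to make the "scalable effects" idea concrete, here's a rough sketch of scaling a debris budget to however much GPU compute is available. To be clear, this is NOT the actual PhysX SDK, just plain CUDA runtime calls, and the "1000 pieces per 16 SMs" budget is a number I made up for illustration:

[code]
// Hedged sketch: scale a debris budget to the available GPU hardware,
// then run one trivial "physics" step on it. Illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: apply gravity and integrate each debris particle.
__global__ void stepDebris(float3* pos, float3* vel, int count, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;
    vel[i].y -= 9.81f * dt;
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main()
{
    // Count the multiprocessors across every CUDA device in the system.
    int devices = 0, totalSMs = 0;
    cudaGetDeviceCount(&devices);
    for (int d = 0; d < devices; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        totalSMs += prop.multiProcessorCount;
    }

    // Made-up budget: 1000 debris pieces per 16 SMs, so a second GPU
    // (or a bigger one) automatically buys you a richer explosion.
    int debrisCount = 1000 * (totalSMs > 0 ? totalSMs : 16) / 16;
    printf("%d SM(s) found, spawning %d debris pieces\n", totalSMs, debrisCount);

    float3 *pos = 0, *vel = 0;
    cudaMalloc((void**)&pos, debrisCount * sizeof(float3));
    cudaMalloc((void**)&vel, debrisCount * sizeof(float3));
    cudaMemset(pos, 0, debrisCount * sizeof(float3));
    cudaMemset(vel, 0, debrisCount * sizeof(float3));

    int block = 256;
    int grid  = (debrisCount + block - 1) / block;
    stepDebris<<<grid, block>>>(pos, vel, debrisCount, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
[/code]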

Nvidia CUDA- CUDA is something that I see being a big driver of GPU sales in the future. The ability to write programs that directly utilise the GPU is something that has always been inevitable. This is something we are going to see more of in the future; it's a great way to use your GPU to the fullest all the time!
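
For anyone wondering what "writing programs that directly utilise the GPU" actually looks like, here's a tiny hedged example: a CUDA kernel that adds two arrays on the GPU. It's toy code, not tied to any particular game or driver:

[code]
// Minimal CUDA sketch: copy two arrays to the GPU, add them there, copy back.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void addArrays(const float* a, const float* b, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];   // one GPU thread per element
}

int main()
{
    const int n = 1 << 20;             // a million floats
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float* hostA   = new float[n];
    float* hostB   = new float[n];
    float* hostOut = new float[n];
    for (int i = 0; i < n; ++i) { hostA[i] = 1.0f; hostB[i] = 2.0f; }

    // Device-side buffers.
    float *devA = 0, *devB = 0, *devOut = 0;
    cudaMalloc((void**)&devA, bytes);
    cudaMalloc((void**)&devB, bytes);
    cudaMalloc((void**)&devOut, bytes);
    cudaMemcpy(devA, hostA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(devB, hostB, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    addArrays<<<(n + 255) / 256, 256>>>(devA, devB, devOut, n);
    cudaMemcpy(hostOut, devOut, bytes, cudaMemcpyDeviceToHost);

    printf("out[0] = %f\n", hostOut[0]);   // expect 3.0

    cudaFree(devA); cudaFree(devB); cudaFree(devOut);
    delete[] hostA; delete[] hostB; delete[] hostOut;
    return 0;
}
[/code]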

If you are interested, here is the link for the GTX 295: (from Evga) http://www.evga.com/articles/00446/

It looks like it's currently going for around $509.99, but keep in mind that this GPU was just released very recently (just got the email notification this afternoon), so expect lower prices in the future as well as some cheap overclocked editions from Evga. For reference, when the GTX 280 was released the Superclocked edition averaged around $450, and now the Superclocked editions are selling for $420; that's a good $30 off something that came pre-superclocked. I suspect that the GTX 295 Superclocked editions (if there are any) will eventually sell for around $490, which in my opinion is DEFINITELY worth the money.

Mountain Dew, happiness in a bottle.
Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 06:03
Well, thanks for the heads up. The 295 looks really awesome, but it needs a minimum wattage of 680, and I have a Thermaltake TR2 RX Cable Management (550W). The other thing, though, is that the staff at the shop I bought it from said Thermaltake's stuff is really under-rated, as in one of their 300W power supplies is about equivalent to a 400W one of standard quality.
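
If the shop's claim holds, the back-of-the-envelope maths looks roughly like this (just a sketch; the figures are only the ones quoted above, nothing measured):

[code]
// Rough PSU headroom check, host-side C++ only. All figures are the
// thread's quoted numbers / the shop's claim, not measurements.
#include <cstdio>

int main()
{
    const double psuLabelWatts      = 550.0;          // Thermaltake TR2 RX label rating
    const double cardMinSystemWatts = 680.0;          // claimed minimum for a GTX 295 system
    const double shopClaimFactor    = 400.0 / 300.0;  // "their 300W behaves like a 400W"

    const double optimisticCapacity = psuLabelWatts * shopClaimFactor;  // ~733W
    printf("Optimistic capacity: %.0f W, card wants %.0f W\n",
           optimisticCapacity, cardMinSystemWatts);
    printf("%s\n", optimisticCapacity >= cardMinSystemWatts
                       ? "Might scrape by, but with no real safety margin"
                       : "Not enough, even with the optimistic claim");
    return 0;
}
[/code]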

I could go and buy the GTX 295, but I'd be taking a biiiig risk. What might be the consequences of a power supply not being powerful enough for the GPU?

Sunflash
19
Years of Service
User Offline
Joined: 28th Jun 2005
Location: Seattle, Wa
Posted: 12th Jan 2009 06:15 Edited at: 12th Jan 2009 06:19
Don't use a 550W PSU with a GTX 280. When I built my computer a few months ago I bought a 650W PSU and the GTX 280 ate it for lunch. I would say the 200-series cards consume more power than is admitted.

A lot of people told me that 600W max would be fine, but I found out the hard way and had to order a new PSU from Newegg. To be safe, and to give myself a lot of extra legroom for future upgrades, I ended up just going with an inexpensive 1000W PSU, and everything has been running amazingly. I could use a few more fans to help dissipate the extra heat my behemoth is producing, but I've just learned to sip ice water when my room starts to boil.

EDIT: When the PSU I had wasn't powerful enough for my GTX 280, the card wouldn't even do any processing. You could see a little red light indicating it was getting some power, but the fan wasn't turning at all, and nothing would display.

Mountain Dew, happiness in a bottle.
Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 06:19
Oh my god. This happens every bloody time I get new hardware. I go to use it, and I find out it's not good enough for something.

Thanks for the advice, anyway.

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 12th Jan 2009 07:48
Quote: "This happens every bloody time I get new hardware. I go to use it, and I find out it's not good enough for something."


Same thing always happens to me! It's always some small thing.

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 12th Jan 2009 07:51
You need to do extensive research when buying new hardware.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 08:21
Yeah, but then new hardware keeps bloody getting released.

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 12th Jan 2009 08:25
Indeed, but that's both a good and a bad thing, depending on how big your pocket is.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 12th Jan 2009 08:43
For me it's usually just an adapter or some thermal paste, nothing too big....

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 12th Jan 2009 10:27
Well, right at the moment, seeing as Nvidia seems to be a bit of a liar when it comes to power requirements (as Sunflash said), it seems the most powerful card I can get is the HD 4870. I'm satisfied with that.

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 12th Jan 2009 16:22
? Sunflash said don't go with ATI.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Sunflash
19
Years of Service
User Offline
Joined: 28th Jun 2005
Location: Seattle, Wa
Posted: 12th Jan 2009 17:50
He means in the way of power consumption. But really, if it's a matter of not having enough power, I would just wait and save for a bigger PSU; by that time GPU prices will have fallen lower too. Lol, it's sort of a win/win situation, except for the bloody saving part.

In regard to the annoyance of new hardware always being released, that used to really bother me till I realized that I'm not buying a computer to show off the latest and greatest hardware release; I want something that works for me. It doesn't need to be the best... just something close.

Mountain Dew, happiness in a bottle.
Raven
20
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 12th Jan 2009 22:28
Quote: "Lol no offence dude, But what year are you living in? Im just about to buy a 24Inch TfT, With a resolution of 1900 x 1200."


That's just not possible; the maximum physical resolution of a 24" TFT is 1600x1200 (square) or 1600x1000 (widescreen)... well, technically it's a little more, but the extra pixels are usually used as an overscan buffer zone.

To physically display 1900x1200 you would need at least a 28" TFT display. It isn't TFT displays that have changed, because they haven't in over 8 years; what has changed is the wider usage of built-in video signal processors (or embedded GPUs) that allow the displays themselves to scale a given signal.

In fact, you may have noticed they've been improving their performance quite a bit over the last few years, given monitor response times have been reduced from 8ms to 2ms in performance displays.

Unlike CRT, TFT can't physically be improved without making the technology smaller, something that is much more difficult than you'd imagine. Businesses get around that by simply making larger displays capable of better picture quality, because, well, they're actually physically displaying a signal rather than needing to process it first.

Have you never wondered why your monitor has a "Smoothness" option?

Quote: "SLI vs Crossfire- As I understand it, Nvidia SLI currently outperforms ATI Crossfire solutions. In addition, using more than 2+ cards while utilizing SLI is still going to get you more of a performance boost than Crossfire, so you really have nothing to gain from using a Crossfire solution yet."


NVIDIA SLi doesn't out-perform ATI CrossFireX, not by a long shot.
SLi vs CrossFireX X2 (plus CrossFireX X4)

ATi has a commanding lead when it comes to multi-GPU performance.

Quote: "Nvidia PhysX- As you know, each Nvidia card in the 200 series (and a few cards below that) all have an included PhysX processor to boost your games dramatically. This is something that ATI cards lack, and won't be getting anytime soon"


Is a PhysX-enabled GPU better for gaming?
Well, it all depends on what sort of game you're playing, doesn't it?

If you can afford SLi NVIDIA GPUs and play quite a few PhysX-based games, then yes... it is an incredible piece of technology for physics solutions.

If, however, you want to do it on a single NVIDIA GPU, sooner or later you'll have to choose between graphics or physics.
The real issue is that very few PC games actually rely on PhysX; in fact, only console games using the Unreal 3 Engine heavily rely upon it. So the question is, will it ever really be needed?

Quote: "PhysX Scalability- One of the amazing things with PhysX in respect to Nvidia is scalability. When some people think of multiple PhysX processors, they think it's 2X as powerful as one. While this is somewhat true, there's really more to it than that. The PhysX physics engine is built to be dybamically scalable, this means that if you are playing a game with a single PhysX enabled GPU, imagine an explosion that blows a building up into 1000 pieces. We can "oooh and awww" at that thinking it's the best we get, until we utilise a second PhysX enabled GPU. With more PhysX processing ability, you will see that the same explosion that you jut saw using the single GPU, now scales up in quality, possibly giving you 5000 pieces of exploding rubble. This scaling effect is something that really sets PhysX apart from other engines, the more you give it, the more it gives you!"


Yeah, this is all "ooh and ahh", but realistically GRAW and GRAW 2 didn't actually look that much more impressive when PhysX-enabled.
In fact, I've yet to see a game where PhysX-capable hardware really made me feel it changed the game dynamics or visual quality far beyond what was already going on.

In fact, only CellFactor and the NVIDIA tech demos actually got me excited about PhysX. Since then, I've seen regular hardware utilised to achieve similar results. That makes me think PhysX is deliberately inhibited when running in software mode in order to make the hardware version far more appealing.

Don't get me wrong, I'm sure you can do more with hardware physics, especially on a dedicated GPU... but the point is that beyond NVIDIA's own tech demos, which could be achieved via shaders, I've seen nothing as impressive.

In fact, if anything it keeps showing me, time and time again, that Havok is a far better physics engine, because it is a damn sight quicker without hardware acceleration; plus the GPU physics it offers works regardless of the GPU you use, and can be linked directly to shaders without having to pass anything back to the CPU-side application to then output to shaders, which you have to do with CUDA.

Quote: "Nvidia CUDA- CUDA is something that I see being a big proponent of GPU sales in the future. the ability to write programs to directly utilise the GPU is something that has always been inevitable. This is something we are going to see in the future, it's a great way to use your GPU to the fullest all the time!

If you are interested, here is the link for the GTX 295: (from Evga) http://www.evga.com/articles/00446/

It looks like it's currently going for around $509.99, but keep in mind that this GPU was just released very recently (just got the email notification this afternoon) so expect lower prices in the futore as well as some cheap overclocked editions from Evga. For reference, when the GTX 280 was released the superclocked edition averaged for around around $450, and now the superclocked editons are selling for $420. thats a good $30 for something that was pre superclocked. I suspect that the GTX 295 superclocked editions (if there will be any) will eventually sell for around $490, which in my oppinion is DEFINITELY worth the money."


CUDA looks awesome from a developer's perspective, but it's ultimately pointless. Not being funny, but it's just one more damn thing to learn that, frankly, we don't need to.
Yes, it's impressive, but until ATi adopts it, or some form of standard is reached between ATi and NVIDIA concerning a common GPU ASM, there is no point in it for real-world game development applications. Sorry, but there just isn't.

CUDA is definitely a step in the right direction, but as far as game development is concerned, in a market rife with piracy, supporting it means either twice the work to make two different versions of the same program for different hardware... or not supporting ATi hardware.

It'd be business suicide to support it right now.

Your signature has been erased by a mod because it's larger than 600x120...
Benjamin
22
Years of Service
User Offline
Joined: 24th Nov 2002
Location: France
Posted: 12th Jan 2009 22:55
I think Raven might be referring to the screen's native resolution, which may be true; I have no idea.

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 12th Jan 2009 22:56 Edited at: 12th Jan 2009 23:01
Quote: "To physically display 1900x1200 you would need atleast 28" TFT display. It isn't TFT displays that have changed, because they haven't in over 8 years; what has changed is the more wide useage of built-in Video Signal Processors (or Embedded GPUs) that allow the displays themselves to scale to a given display."


To physically display 1900x1200, all you would need is just about any 24'' or larger LCD that I have ever seen.

Quote: "I think Raven might be referring to the screen's native resolution, which may be true; I have no idea."


I sincerely hope you are right.

And even so, take a look at this article (scroll down to "Computer Monitors"):

http://en.wikipedia.org/wiki/1080p#Computer_monitors

"Additionally, many 23, 24 and 27-inch (690 mm) widescreen LCD displays use 1920×1200 as their native resolution, 30 inch displays can display beyond 1080p at up to 2560x1600 or 1600p."

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 12th Jan 2009 23:15
Quote: "To physically display 1900x1200 you would need atleast 28" TFT display. It isn't TFT displays that have changed, because they haven't in over 8 years; what has changed is the more wide useage of built-in Video Signal Processors (or Embedded GPUs) that allow the displays themselves to scale to a given display."

Nope, we are all wrong. 24" TFT monitors have a native resolution of 1920x1200. I'm not exactly sure what would make you think it's impossible to have a 24" at that resolution; just decrease the size per pixel.

Just take a look at Scan; even this cheap one has this feature:
[href]http://www.scan.co.uk/Products/24-LG-W2452T-Black-Widescreen-1080p-HDCP-LCD-Monitor-1920x1200-100001-DCR-5-ms-DVI-D-VGA[/href]

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 13th Jan 2009 01:55
How was I wrong?

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 13th Jan 2009 02:16
Sunflash, my dad went to buy a new power supply and he wasn't expecting to pay a lot, but he pulled his finger out and bought the best thing the shop had. He really didn't want to pay that much, but he ended up paying that much anyway.

I will not be getting a new power supply, and that's that. But thanks for your input anyway.

Mods, you can lock this now.

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 13th Jan 2009 05:42
What do you mean you're not getting a new power supply? You just said your dad got one.

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 13th Jan 2009 05:58
Yeah, that's the one I have now. The 550 watt one.

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 13th Jan 2009 06:12
So let me get this straight. You have in your possession a £300 GPU, which all gamers crave, and you are not going to use it because you don't want to get a PSU?

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 13th Jan 2009 06:23
No. I am attempting to purchase one which my power supply can handle. I haven't actually got a GPU yet, I'm on integrated graphics ATM.

Uncle Sam
19
Years of Service
User Offline
Joined: 23rd Jul 2005
Location: West Coast, USA
Posted: 13th Jan 2009 07:14
Confused I am.

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 13th Jan 2009 07:44
Okay. I recently bought a whole boatload of things to upgrade my computer (I, meaning my dad) and those things were the previously mentioned motherboard, hard drive, processor and RAM. Not mentioned was the 550 watt power supply. I then decided to make this thread in an attempt to choose a bloody good GPU for my computer. I asked about that, and Sunflash pointed me towards a GeForce GTX 295. I then proceeded to say 'My power supply is only 550 watts, I cannot run it'. Then I said 'How about a GTX 285?' Then Sunflash said 'The minimum requirement is a bit of a lie, you should consider getting a better PSU'. I proceeded to completely freak out, and got pissed off because my dad had just reluctantly bought me a brand spanking new power supply.

And here we are.

Lemonade
16
Years of Service
User Offline
Joined: 10th Dec 2008
Location:
Posted: 13th Jan 2009 07:58
Yeah, so what are the specs on the new power supply?
Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
kBessa
18
Years of Service
User Offline
Joined: 8th Nov 2006
Location: Manaus, Amazonas, Brazil
Posted: 13th Jan 2009 08:23 Edited at: 13th Jan 2009 08:29
@Toasty Fresh:

One thing that might help you decide which card to get is to base it on your monitor's maximum resolution.

I recently bought a new PC and had to decide between two ATI GPUs: the HD 4850 or the HD 4870. I ended up getting the HD 4850 because I didn't have the money for the HD 4870 (there's a huge price difference; I'm from Brazil).

But I'm more than happy with it. I play Unreal Tournament 3, Prince of Persia (2008), Silent Hill Homecoming, Dead Space, Red Alert 3, FlatOut UC, etc. Everything runs smoothly on the highest settings available, with full AA (8x) and AF (16x), at my monitor's native resolution of 1680x1050 (it's a 22" LCD from LG, model M228WA).

I even tried the Crysis demo; it runs at 22fps in the GPU benchmark, but when I play it runs just as smoothly as the other games, all maxed out at 1680x1050.

Of course games will be heavier in the future, but I just feel it ain't right when I see people buy an HD 4870 X2 to play on a 17" screen with a 1024x768 max resolution.


Edit: Just in case someone's wondering what the other components are: C2D E8400 (3GHz, 6MB cache), 4GB (2x2GB) DDR2@667MHz (half FSB, some Kingston memory, only 3GB usable because it's 32-bit XP), X-Fi Titanium.

Toasty Fresh
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: In my office, making poly-eating models.
Posted: 13th Jan 2009 09:08
Thanks mate. Guess what? You just happened to be in the exact same situation as I am now, to some extent. My maximum res is 1680x1050 (at least I'm pretty sure it is, I'm running on that now and I can't go any higher) and I'm considering getting an ATi card.

Thanks for the tips, everybody.
