Geek Culture / CPU's Usefulness

Author
Message
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 20th Nov 2008 08:08
Basically, how useful is a powerful CPU for gaming nowadays? Because to be honest, I have never got more than a 5-10fps boost simply from significantly upgrading my CPU. So what is a good CPU useful for? Or is it simply so you can say to your friends "I've got a £1000 CPU"?

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

draknir_
18
Years of Service
User Offline
Joined: 19th Oct 2006
Location: Netherlands
Posted: 20th Nov 2008 09:35
Well, there's no point in having a £1000 GPU either if it's not supported by a similarly overpriced CPU.

I think it depends entirely on the game and the CPU you're selecting. Quad Core CPUs don't really boost gaming much because next to no games use the four cores effectively. Some games take more advantage of the CPU than others, for Physics, Lighting, AI, etc.

Unless you're swimming in money I wouldn't spend £1000 on a CPU.
El Goorf
18
Years of Service
User Offline
Joined: 17th Sep 2006
Location: Uni: Manchester, Home: Dunstable
Posted: 20th Nov 2008 09:40
I have an AM2 3600+ and my PC plays Crysis, Supreme Commander, and Company of Heroes, to name a few, on max graphics settings

http://notmybase.com
All my base are not belong to anyone.
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 20th Nov 2008 09:48
"Well, theres no point in having a £1000 GPU either if its not supported by a similarly overpriced CPU."
Lol can you get a gpu for £1000? Anyway if i put a 280gtx in my current system, im sure i would see a diffrence. If i upgraded my processor, there would be a few fps increace at max.

Exactly. Why are intel like "Oh buy this cpu for the ultimate gaming experience" When it makes virtualy no diffrnece.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Veron
18
Years of Service
User Offline
Joined: 22nd Nov 2006
Location:
Posted: 20th Nov 2008 10:07
It's a balance of both. Sure, I could rip a 280GTX into my PC, but only have a Pentium 3 CPU. I'd load Crysis up, and it'd collapse, obviously because the CPU just couldn't handle the load.

Whether the GPU or CPU is more important in obtaining high frame rates, I don't know, but I think having a decent CPU is just as important as having a decent GPU, and the better the CPU, the better frame rates you'll get. If that wasn't true, everyone would still be using a P4, and there would be no dual/tri/quad core.
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 20th Nov 2008 10:32
"Whether the GPU or CPU is more important in obtaining high frame rates, I don't know, but I think having a decent CPU is just as important as having a decent GPU, and the better the CPU, the better frame rates you'll get. If that wasn't true, everyone would still be using a P4, and there would be no dual/tri/quad core. "
Yes, this is true up to a point. If you took a processor made two years ago, and then took a new i7, there would be a performance boost of about 10-15fps max in games. Most games don't even use up my old 6000+ X2.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

El Goorf
18
Years of Service
User Offline
Joined: 17th Sep 2006
Location: Uni: Manchester, Home: Dunstable
Posted: 20th Nov 2008 11:52
Quote: "It's a balance of both. Sure, I could rip a 280GTX into my PC, but only have a Pentium 3 CPU. I'd load Crysis up, and it'd collapse, obviously because the CPU just couldn't handle the load."



Quote: "i have an AM2 3600+ and my pc plays crysis, supreme commander, and company of heroes to name a few on max graphics settings"


Just save the money for a decent graphics card, RAM and cooling. Of course a Pentium 3 would be rubbish, but you can still get by with a relatively old CPU.

http://notmybase.com
All my base are not belong to anyone.
Kevin Picone
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Australia
Posted: 20th Nov 2008 12:39
Quote: "Because tbh i have never got more than a 5-10fps boost simply from significantly upgrading my cpu"


These numbers are meaningless without some type of context, because a 5fps increase can mean a little or a lot.

For example:

If Program A was running at 10fps on CPU1 (100 millisecond refresh), but runs at 15fps with CPU2 (66.6 millisecond refresh), then CPU2 is giving approximately a 33% performance improvement (each frame takes a third less time).

If Program B was running at 100fps on CPU1 (10 millisecond refresh), but runs at 105fps with CPU2 (9.5 millisecond refresh), then CPU2 is only giving us approximately a 5% performance improvement.
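To make that arithmetic concrete, here is a small illustrative C++ sketch (not from the thread; the function names are just for illustration) that converts fps figures into frame times and relative gains:

    #include <cstdio>

    // Convert a frames-per-second figure into a frame time in milliseconds.
    double frameTimeMs(double fps) { return 1000.0 / fps; }

    // How much shorter each frame becomes, as a percentage of the old frame time.
    double frameTimeGainPercent(double oldFps, double newFps)
    {
        double oldMs = frameTimeMs(oldFps);
        double newMs = frameTimeMs(newFps);
        return (oldMs - newMs) / oldMs * 100.0;
    }

    int main()
    {
        // Program A: 10fps -> 15fps, a "5fps" gain that is actually large (~33%).
        std::printf("Program A: %.1f%% less time per frame\n", frameTimeGainPercent(10.0, 15.0));
        // Program B: 100fps -> 105fps, a "5fps" gain that is tiny (~4.8%).
        std::printf("Program B: %.1f%% less time per frame\n", frameTimeGainPercent(100.0, 105.0));
        return 0;
    }

The same "5fps" headline figure therefore means very different things depending on the starting frame rate.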

BatVink
Moderator
22
Years of Service
User Offline
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 20th Nov 2008 15:59 Edited at: 20th Nov 2008 15:59
I also don't understand why people get so obsessed with frame rates in games. Beyond 50FPS it makes no difference because that is the limitation of the human eye.

The question is actually how much it can handle at a reasonable frame rate. One machine might be able to process everything you throw at it at 50FPS, while another just manages the minimum, running at the same FPS.

I would rather my machine was number-crunching some good game AI at 50FPS, than being able to tell my mates I was running at 1000FPS.

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 20th Nov 2008 16:01
Quote: "Beyond 50FPS it makes no difference beause that is the limitation of the human eye."

Well, I don't know about you, but I can tell the difference between 50fps and 80fps very easily.


Quote: "I would rather my machine was number-crunching some good game AI at 50FPS, than being able to tell my mates I was running at 1000FPS."

I agree, but I prefer 75-80FPS. Anything below 60 seems sluggish to me.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

zenassem
22
Years of Service
User Offline
Joined: 10th Mar 2003
Location: Long Island, NY
Posted: 20th Nov 2008 16:56 Edited at: 20th Nov 2008 17:54
Quote: "Batvink: "Beyond 50FPS it makes no difference beause that is the limitation of the human eye."

General Reed: "Well i dont know about you but i can tell the diffrence between 50fps, and 80fps very easily.""



Batvink, I thought the same thing until I realized....
How many frames per second can the human eye see?
This is a tricky question. And much confusion about it is related to the fact that this question is NOT the same as:

How many frames per second do I have to have to make motions look fluid?

And it's not the same as

How many frames per second makes the movie stop flickering?

And it's not the same as

What is the shortest frame a human eye would notice?

"When I look at that square... I wish FPSC noobs would stay on their side of the forums and stop polluting these boards." - Benjamin
Darth Kiwi
20
Years of Service
User Offline
Joined: 7th Jan 2005
Location: On the brink of insanity.
Posted: 20th Nov 2008 17:49
I have an AMD Athlon 64 3500+ processor, 1GB RAM and an Nvidia 8600. I can play most things, but the more recent games can cause problems. For example, BioShock was okay on medium settings (I could play 1024x768 with medium textures fine, though high textures caused some stuttering). But Mass Effect was an absolute nightmare: occasional stuttering (i.e. turn the camera and it gets "stuck" for a second), I had to play with everything on low, and it crashed every 45 minutes. (Pretty much like clockwork.) Since my GPU is fairly good (not great but not shoddy) and my 1GB RAM is also kind of okay (another GB would be good but I'm not sure it's necessary), I think it's the CPU which might be slowing everything down. Plus it doesn't help that I have a few years' worth of digital cobwebs on the machine taking up processing power.

Question: is it possible to replace my AMD Athlon 64 3500+ CPU with a better (preferably dual-core) CPU without completely replacing my computer? I looked into it before but got the impression that the CPU was fairly integral to the machine. (Yeah, I'm not much of a hardware buff...)

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 20th Nov 2008 17:50
Stuttering in games is usually caused by lack of memory, not CPU problems.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

David R
21
Years of Service
User Offline
Joined: 9th Sep 2003
Location: 3.14
Posted: 20th Nov 2008 18:02 Edited at: 20th Nov 2008 18:03
Quote: "Stuttering in games is usualy caused by lack of memory, not cpu problems.
"


That's not necessarily true, especially since there are various forms of stutter. Stuttering can be caused by anything from the hard drive being slow or slow to spin up (resource caching) to anything that causes a blocking operation (a primary example being things that block the soundcard or sound from being serviced, which causes the "sound loop" effect where the sound data in the card's memory isn't changed at the correct pace, producing audible stutter).


09-f9-11-02-9d-74-e3-5b-d8-41-56-c5-63-56-88-c0
zenassem
22
Years of Service
User Offline
Joined: 10th Mar 2003
Location: Long Island, NY
Posted: 20th Nov 2008 18:07
Quote: "Stuttering in games is usualy caused by lack of memory, not cpu problems."


I agree (usually), but not always.

The Gamer's Graphics & Display Settings Guide

Performance Tip: This step highlights an important fact: your graphics card can only work as fast as your CPU can allocate it information. So a fast graphics card will typically perform better on a system with a faster CPU. In fact many high-end graphics cards are artificially bottlenecked because they are on a system with a CPU that is not fast enough to allocate data to them. In contrast, the higher the resolution and graphics settings for a game, the less important the CPU becomes, as the bulk of the work will be done by the graphics card. The bottom line is the CPU is still the linchpin of your entire system.

"When I look at that square... I wish FPSC noobs would stay on their side of the forums and stop polluting these boards." - Benjamin
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 20th Nov 2008 18:12
Well, it's just from personal experience. I've never gained more than a few fps in gaming after a CPU upgrade. If I upgrade graphics or memory, the fps shoots up.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

_Nemesis_
21
Years of Service
User Offline
Joined: 9th Nov 2003
Location: Liverpool, UK
Posted: 20th Nov 2008 18:53 Edited at: 20th Nov 2008 18:54
http://www.tomshardware.co.uk/cpu-gpu-upgrade,review-30828.html

Tom's Hardware, as always, have something about this.

[url="http://www.devhat.net"]www.devhat.net[/url] :: Devhat IRC Network.
Jeku
Moderator
21
Years of Service
User Offline
Joined: 4th Jul 2003
Location: Vancouver, British Columbia, Canada
Posted: 21st Nov 2008 01:20
Almost all algorithms you use in game engineering boil down to speed versus memory. You can have something execute quickly by caching to RAM, or something CPU-intensive and slower with little to no RAM usage.

Game stuttering is usually down to bottlenecks and incorrect or poor decisions made with regard to saving either speed or memory. You can technically make a game and have it run entirely without using RAM, but then you have to use processor-intensive math to do the calculations every tick. Or you can store lookup tables with pre-determined values in RAM and the game should execute much faster, but of course it will use much more RAM or hard drive storage. Often there's no perfect answer, and the user's CPU or RAM size is not beefy enough to handle the developer's decisions.
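As a rough sketch of that speed-versus-memory trade-off (illustrative C++ only, not code from any game or engine mentioned here), the classic example is a precomputed lookup table versus recomputing a value every tick:

    #include <cmath>
    #include <vector>

    // Option 1: compute on demand - no extra RAM, more CPU work every tick.
    double sineOnTheFly(double degrees)
    {
        return std::sin(degrees * 3.14159265358979323846 / 180.0);
    }

    // Option 2: precompute a table once - more RAM, far less work per tick.
    class SineTable
    {
    public:
        SineTable() : table(360)
        {
            for (int d = 0; d < 360; ++d)
                table[d] = sineOnTheFly(static_cast<double>(d));
        }
        // Nearest-degree lookup; trades a little accuracy for speed.
        double lookup(int degrees) const
        {
            return table[((degrees % 360) + 360) % 360];
        }
    private:
        std::vector<double> table;
    };

    int main()
    {
        SineTable sines;                  // cost paid once, at load time
        double slow = sineOnTheFly(45.0); // CPU-heavy path
        double fast = sines.lookup(45);   // memory-heavy path
        return (slow > 0.0 && fast > 0.0) ? 0 : 1;
    }

Which path is "right" depends on whether the target machine has spare CPU time or spare memory, which is exactly the judgment call being described above.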

My point is CPU, RAM, hard drive speed, disc speed, etc. can all potentially factor into various games' performance in different ways, and CPU isn't any less important than, say, your GPU. It would be cool if developers allowed advanced settings so you could choose to, for example, preload entire game environments in RAM if you have the storage.


General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 21st Nov 2008 04:54
Well, surely if the CPU was just as important as the GPU for games, then it would make the same sort of difference in performance when you upgrade the CPU (obviously with the same generation gap as the GPU). The simple fact is it doesn't, nowhere near, and while the CPU is important for number-crunching applications, games simply aren't one of them, or we would see a significant difference. I personally have never managed to justify a processor upgrade for gaming. The two things that make a difference are RAM and GPU, regardless of what theoretically should be the case.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Mahoney
17
Years of Service
User Offline
Joined: 14th Apr 2008
Location: The Interwebs
Posted: 21st Nov 2008 05:00
Quote: "Well, surely if the CPU was just as important as the GPU for games, then it would make the same sort of diffrence in performance when you upgrade the CPU (Obviously with the same generation gap as the gpu). The simple fact is it dosent, nowhere near, and while the cpu is important for numbercrunching applications, Games simply arent one of them, or we would see a significant diffrence."


Or, just maybe, the CPU you had in the first place was sufficient, because games offload a lot of the processing to the GPU.

Windows Vista Home Premium Intel Pentium Dual-Core 1.6 Ghz 1GB DDR2 RAM GeForce 8600GT Twin Turbo
Raven
20
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 21st Nov 2008 05:34
Processor performance realistically has little weight on overall performance beyond throughput of instructions and delegation of tasks.

So while, sure, the more money you spend to get more cores will improve performance to a degree, simply because you'll have new threads run in a serialised form across all of the cores; parallelisation would be a better option, as it makes better use of the multi-core aspect, but it has to be done application-specific.

The same can be said about raw performance.

I've done this analogy before, but think of the processor as the engine of a car. Sure, a bigger block, more cylinders, etc. will mean more torque. The trouble is it does you no good if all that performance means is your tyres keep popping.

Putting together a "fast" computer is realistically as easy as putting in the best possible components your money can afford. That doesn't mean it will be particularly good at doing tasks properly.

For example, a drag car is a damn fast car, but try turning a corner and you find you need to slow everything to a crawl in order to do it.

It's by finding a decent balance, and parts that work together, that you'll be able to have something that specification-wise seems quite modest actually out-perform someone with more money than sense.

The simplest thing you can do is basically make sure you have a processor, graphics card and memory that can all play well together. Delve deeper into this tuning, though, getting parts that can sync their bus speeds to the same multiples, and you'll notice a huge performance boost from eliminating the usual suspect bottlenecks.

While multi-cores, for example, are cool, right now they're still fairly young as a technology and still rely heavily on the developer taking advantage. This isn't to say they're a pointless technology, but 2-3-4-8 cores... realistically makes very, very little difference unless you're encoding audio or visual feeds.

I'll use my own computer as an example, given it is quite finely tuned anyway, but more so because the hardware was bought specifically to work together.

The system I'm on right now might seem modest:

AMD Athlon 64 X2 Dual Core 3800+ 2.0GHz 1MB Cache
AMD Series-7 750FX Motherboard
Crucial 2GB (2x1GB) 800MHz PC2-6400 CL5 Memory
AMD ATI Radeon HD 2600 XT CrossFireX 256MB 800MHz (Dual Card)
Western Digital 250GB SATA-2 16MB Cache

This system gets a nice, respectable 4.8 Vista SPI, limited only by the processor, which it believes is a weak point.

Now, a beautiful part of the setup, firstly, is that everything inside works on multiples of a 200MHz FSB. This is quite an important fact, as it sets the system to run synchronised from the start, meaning there is very little downtime waiting for cycles of each component to finish and sync data again.

Another important thing is that memory bandwidth is not stretched too close to its limit, allowing it to really utilise what is available.

Something else to note is the memory setup. I could've bought a single 2GB stick for the same price as the two identically sized sticks; it might not seem important, but memory is designed to work in pairs, just the way it's always been. Although sure, you now have Dual Inline Memory Modules that no longer require you to pair, the fact is that the majority of memory going into computers now is actually SoDIMM; meaning that, while designed to run as a single inline, you will see performance from adding a second identical board.

I could also go into how it can also help the CL performance by allowing memory passes on both incoming and outgoing calls; but meh.

Another thing to really note is that having more than one stick provides more than one memory area that can be accessed at once. This leads to quite dramatic performance increases... and it's possibly one of the more important aspects as well; it's what gets you closer to that theoretical performance the manufacturers claim.

Now another point to note is the hard disk size and cache size. Again these are quite important (same with optical media drives): you want more cache, even if it is at the sacrifice of raw performance or space. In fact, for hard disks, more space per disk = poorer performance.

Think of the hard disk as a library, and the cache as the trolley you have to take the books to the front desk on. No matter how quickly you can get at those books, if you've got a pathetic trolley then you're going to spend a lot of time taking the small collection you find back and forth.
Also, if you have a much larger collection, again no matter how well organised (which, honestly, with Windows as the librarian often means books aren't always where you expect them, or in any order that makes sense over time), the reality is more books = more search time. As such, a smaller drive frankly performs better, as the data is closer together and easier to keep in check (with defraggers).

Realistically, the static nature of optical media ends up honestly being a quicker way to keep data, as it never fragments and is a fairly small space to search. As such, actual data search and retrieval time is extremely small. I mean, there is often a difference in the "number of employees", so to speak, between the two, meaning they are both better at different things... plus performance/bandwidth is greater on one over the other, but when accessed constantly, hard disks end up overworked, thus fragmentation, thus eventual slowing.

This might seem like some stupid over-thinking just for building a computer, but consider that it is running technology that realistically is now completely outdated; yet by simply making sure parts work together, providing the correct bandwidth and synchronisation, there are very noticeable performance results in and outside of games.

One thing I'm quite proud of is that despite having several start-up programs:
Messenger, Media Center, Autodesk Backburner, Steam, Live, Live Mesh, Visual Studio Development Environment, XNA

all on boot... it is able to go from a cold boot to a usable machine in under 20 seconds. Compare that to average systems, which even from fresh installs (this has had Vista installed since release) can take up to 60 seconds (or more).

That is a noticeable performance increase. The same spills over into games as well, with games that often ask for more than this system for their recommended specs running at acceptable, playable rates with graphics maxed.

You'd be surprised how much of a difference keeping in mind the small, easily overlooked aspects of the system can make.

Green Gandalf
VIP Member
20
Years of Service
User Offline
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 21st Nov 2008 21:14
Is it my imagination or are Raven's posts getting shorter? Probably a trick of the light.

Anyway, to turn to zenassem's post:

Quote: "Batvink, I thought the same thing until I realized....
How many frames per second can the human eye see?
This is a tricky question. And much confusion about it is related to the fact that this question is NOT the same as:

How many frames per second do I have to have to make motions look fluid?

And it's not the same as

How many frames per second makes the movie stop flickering?

And it's not the same as

What is the shortest frame a human eye would notice?
"


Just had a look at that link. I wonder if they realise that the electric light that we use to light our rooms imposes an fps on us? If you look at a wall and move your hand across your face you may see a flickering set of images (certainly if you move it across the screen when you read this) - but you might be so used to it you wouldn't notice.
Sasuke
19
Years of Service
User Offline
Joined: 2nd Dec 2005
Location: Milton Keynes UK
Posted: 22nd Nov 2008 00:50
Quote: ""Ive got a £1000 CPU?""


Damn, so close, mine's an Intel Core i7-965 Extreme.

My Rig:
Intel Core i7-965 Extreme 3.73 GHz
Asus P6T Deluxe motherboard
9GB Corsair DDR3 Ram
Samsung F1 1TB
Western Digital VelociRaptor 300GB
HIS Radeon 4870X2 2GB GDDR5
Cooler Master Silent Pro 700 PSU

It's funny that every time something like this comes up, everyone (other than Raven) seems to think that the only reason to get something like this must be for games. Stuff games, have you ever tried rendering a 30 minute animation in HD 1080p on a slow CPU? You'd have to leave it for a week; same for decoding and encoding of video files. But the best thing is the multiple cores: who wants to sit in front of their computer waiting for a movie to encode? Let one core do it, and the others will enjoy a quick game of TF2 or L4D.

A dream is a fantasy, if you achieve that fantasy it was never a dream to begin with.
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 22nd Nov 2008 01:20
Wow, Sasuke. Don't take this offensively, but you must be spoilt rich to afford that rig. I wish I could get something like that.

Anyway, I do understand that CPUs are good for applications, or running more than one thing at once, but for games, as I said, there's no real performance increase.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Sasuke
19
Years of Service
User Offline
Joined: 2nd Dec 2005
Location: Milton Keynes UK
Posted: 22nd Nov 2008 03:12
Quote: "Wow sasuke. Dont take this offencivly, but you must be spoilt rich to aford that rig. I wish i could get something like that."


I wish! This is all down to pure saving. I had saved up about £4000 over 3 years, for a flat or car or something. But during those 3 years I managed to gain them, minus the car; I never really needed one, as I love walking/running (also skateboarding, freerunning, BMX/trials biking, the list goes on) everywhere (though I have a driver's licence for when I start). So I said stuff it, you've only got one life; best to go mad and spend a butt-load of cash now rather than when you have a cash-flow-sucking girlfriend or kids.

Quote: "theres no real performance increace."


Ture, but it does take the limit off your graphics card, plus it's still early day, many new game dev technique to be made that take advantage of all this power.

A dream is a fantasy, if you achieve that fantasy it was never a dream to begin with.
Kevin Picone
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Australia
Posted: 22nd Nov 2008 05:53
Real-time (asynchronous) graphical applications such as games have a tipping point. If the CPU is quicker than the GPU can respond, then, forgetting infrastructure, the application's performance will be bound to the speed of the GPU; conversely, if the CPU can't pass the GPU data fast enough (so the GPU stalls), then it becomes CPU bound. In the former case, performance could theoretically be improved more by replacing the GPU, and in the latter the inverse: the CPU is the bottleneck.
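One rough way to picture that tipping point (a sketch under assumptions only; the two stage functions below are hypothetical stand-ins for a real engine's CPU work and GPU wait, and proper tools such as PIX or GPU timer queries do this far better) is to time how much of each frame the CPU spends before it ends up waiting on the GPU:

    #include <chrono>
    #include <cstdio>

    using Clock = std::chrono::steady_clock;

    static double seconds(Clock::time_point a, Clock::time_point b)
    {
        return std::chrono::duration<double>(b - a).count();
    }

    // Hypothetical stand-ins for a real engine's per-frame stages.
    static void runGameLogicAndSubmitDrawCalls() { /* AI, physics, draw-call submission */ }
    static void waitForGpuAndPresent()           { /* blocks until the GPU finishes the frame */ }

    int main()
    {
        for (int frame = 0; frame < 600; ++frame)
        {
            auto start   = Clock::now();
            runGameLogicAndSubmitDrawCalls();
            auto cpuDone = Clock::now();
            waitForGpuAndPresent();
            auto end     = Clock::now();

            double cpuTime   = seconds(start, cpuDone);
            double frameTime = seconds(start, end);

            // If the CPU fills nearly the whole frame, the CPU is the bottleneck;
            // if most of the frame is spent waiting for the GPU, the GPU is.
            if (frame % 100 == 0 && frameTime > 0.0)
                std::printf("frame %d: CPU used %.1f%% of the frame time\n",
                            frame, 100.0 * cpuTime / frameTime);
        }
        return 0;
    }

If that percentage sits near 100%, a faster CPU would help; if it sits low, the frame is dominated by waiting for the GPU and a CPU upgrade will barely move the frame rate.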

General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 22nd Nov 2008 10:13
"also skateboarding, freerunning, bmx/trials biking the list goes on" Hehe i Love freerunning!

Quote: "So I said stuff it, you've only got one life, best to go mad and spend a but load of cash now than when you have cashflow sucking girlfriend or kids."

Lol, i can understand that. Every time i 'blow' alot of money on a peice of computer equipment, my whole family are lile 'Oh you could have bought something else' - Very annoying.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

draknir_
18
Years of Service
User Offline
Joined: 19th Oct 2006
Location: Netherlands
Posted: 22nd Nov 2008 12:25 Edited at: 22nd Nov 2008 12:26
Quote: "Real time (asynchronous) graphical applications such as games have a tipping point. If the CPU is quicker than GPU can respond, then forgetting infrastructure, the applications performance will be bound to the speed of the GPU, moreover, if the CPU can't pass the GPU data fast enough (it stalls), then it becomes CPU bound. In former case, the performance could theoretical be better improve more by replacing the GPU and in the later the inverse, the CPU is the bottleneck."


This is fairly obvious, but how can you tell which one is the bottleneck?
Mr Z
17
Years of Service
User Offline
Joined: 27th Oct 2007
Location:
Posted: 22nd Nov 2008 12:27
A good CPU is awesome to have. Maybe not the only factor in games, but it also makes the general computer experience just so much more pleasant.

There is no greater virtue than the ability to face oneself.
Mahoney
17
Years of Service
User Offline
Joined: 14th Apr 2008
Location: The Interwebs
Posted: 22nd Nov 2008 17:08
Quote: "This is fairly obvious, but how can you tell which one is the bottleneck? "


Upgrading your CPU and nothing happening is a pretty good sign.

But, aside from answering the OP: try underclocking or overclocking the GPU and watch the results. Try the same with the CPU, if you can. Other than that, you should be able to figure it out simply by knowing how nice a CPU you need to keep up with your graphics card.

Windows Vista Home Premium Intel Pentium Dual-Core 1.6 Ghz 1GB DDR2 RAM GeForce 8600GT Twin Turbo
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 22nd Nov 2008 17:12
"Upgrading your CPU and nothing happening is a pretty good sign."
That is wrong. If you tried upgrading your graphics card, then your CPU, and nothing happened, it simply means your old CPU was already powerful enough, which is 9 times out of 10 the case nowadays.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Mahoney
17
Years of Service
User Offline
Joined: 14th Apr 2008
Location: The Interwebs
Posted: 22nd Nov 2008 21:16
Quote: "That is wrong. What if you tried upgrading your graphics card, then your cpu and nothing happend, then it simply means your old cpu is already powerful enough, which is 9/10 the case nowdays."


Um, what?

Windows Vista Home Premium Intel Pentium Dual-Core 1.6 Ghz 1GB DDR2 RAM GeForce 8600GT Twin Turbo
bitJericho
22
Years of Service
User Offline
Joined: 9th Oct 2002
Location: United States
Posted: 22nd Nov 2008 21:19
He's saying that if you upgrade the CPU and it didn't change anything, you just wasted your money.


It's not just for BYOND you know!
Mahoney
17
Years of Service
User Offline
Joined: 14th Apr 2008
Location: The Interwebs
Posted: 22nd Nov 2008 21:23
Quote: "He's saying, if you upgrade the cpu and it didn't change anything, you just wasted your money"


Oh. It was a joke.

Windows Vista Home Premium Intel Pentium Dual-Core 1.6 Ghz 1GB DDR2 RAM GeForce 8600GT Twin Turbo
draknir_
18
Years of Service
User Offline
Joined: 19th Oct 2006
Location: Netherlands
Posted: 22nd Nov 2008 22:08
Quote: "Upgrading your CPU and nothing happening is a pretty good sign.

But, aside from answering the OP: try underclocking or overclocking the GPU and watch the results. Try the same with the CPU, if you can. Other than that, you should be able to figure it out simply by knowing how nice a CPU you need to keep up with your graphics card."


Hmm, I sort of meant before you purchase. I guess there's no easy way of figuring that out.
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 22nd Nov 2008 22:36
Lol, no, it was not a joke. I will try to explain it more clearly:
Say one had a 6000+ X2 and an 8800GT/GTX or something to start with. If one then upgraded the graphics card to a 280 GTX/4870 X2, you would see a huge performance gain. But if, after upgrading the GPU, one then upgraded the CPU to the latest i7, you would see much, much less of a performance increase (in games), for a lot more money.

I don't know if that made any sense, but I can't make it much clearer than that.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Veron
18
Years of Service
User Offline
Joined: 22nd Nov 2006
Location:
Posted: 23rd Nov 2008 02:41
Quote: "Say one had a 6000X2+, and had a 8800GT/GTX or something to start with. Then if one then upgraded the graphics card to 280gtx/4870x2, you would see a huge performance gain. But then if you after upgrading the gpu, one upgraded the CPU to the latest I7, you would see much much less of a performance increace (in games), for alot more money."


Do you actually have any proof of that being true?
Mahoney
17
Years of Service
User Offline
Joined: 14th Apr 2008
Location: The Interwebs
Posted: 23rd Nov 2008 04:08
Quote: "Say one had a 6000X2+, and had a 8800GT/GTX or something to start with. Then if one then upgraded the graphics card to 280gtx/4870x2, you would see a huge performance gain. But then if you after upgrading the gpu, one upgraded the CPU to the latest I7, you would see much much less of a performance increace (in games), for alot more money."


That's what I was saying. The GPU, in that example, was more of a bottleneck than the CPU.

Windows Vista Home Premium Intel Pentium Dual-Core 1.6 Ghz 1GB DDR2 RAM GeForce 8600GT Twin Turbo
General Reed
19
Years of Service
User Offline
Joined: 24th Feb 2006
Location:
Posted: 23rd Nov 2008 07:57
Quote: "Do you actually have any proof of that being true? "

No, but I went through the same process recently, and I was very disappointed with the gain from CPU power in games.

Quote: "That's what I was saying. The GPU, in that example, was more of a bottleneck than the CPU. "

Sorry, my bad. I wasn't very awake.

CPU: AMD X2 6000+ 3.0ghz GFX: NVIDIA BFG Geforce 8800GTS 640MB OC-550mhz core RAM: 2048mb

Raven
20
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 24th Nov 2008 04:50
I'd suggest, if you're wondering whether your CPU is a bottleneck, reading through my post carefully.

There are three main types of bottleneck the CPU can have and cause.

1) Memory Throughput - This is where you barely see framerate changes between resolutions (often at lower resolutions), because the CPU performance is holding up the transfer of data to the GPU. In this case, while adding more raw power does help, it only slightly alleviates the issue, giving you net gains of only a few frames.

Moreover, often this issue might not be entirely CPU bound, but actually an issue with the overall physical bus speeds available. The more bandwidth you can allow, the more data can be transferred and the better your GPU can work.

That said, as far as the CPU goes, you'd be surprised how much of a difference in performance you'll notice getting the exact same processor with a larger cache.

A good example really means going back a generation (or two), as Intel used to name them all based on their cache size.

Celeron - 384K Cache
Pentium - 1MB Cache
Pentium Extreme - 2MB Cache

You could get all three running at 2.5GHz, but you'd notice some serious framerate increases across the three if they were tied to a fairly decent dedicated GPU.

The same can be said for increasing your memory size, so there is more physical data available. Although doing that is a double-edged sword, because if you have too much memory at too low a bus speed, increasing past a certain point requires more time for random-access read/write operations.

It is only recently, with DDR3/DDR4 memory performance, that more than 4GB of memory becomes useful rather than a bottleneck; conversely, smaller amounts at these speeds make very little sense, as the memory can end up waiting for new instructions and thus drop entire cycles doing nothing, which is why no one saw real performance advantages from DDR3 when it first arrived.

2) Processor Performance - The raw speed of the processor can cause issues, and often it isn't obvious whether it is your graphics card or processor, making it the most annoying bottleneck without using a PIX (performance indexing) application to see what the system seems to be stalling on. Usually the most noticeable issues you'll see will be tied more to the regular I/O operations of the system, i.e. the hard disk. Especially in Vista, it will needlessly and constantly show that it is accessing the hard disk, or your system will lag in applications like Windows Media Player while you're trying to do something like install an application.

In-game this again manifests as permanent hard disk accessing, as the CPU tries to keep up with the GPU by pre-loading resources. Half-Life 2 is an extremely good game to test that out on.

3) Synchronisation Lag - This is a truly annoying one and extremely easy to spot. Let's say you're playing a game, and every so often you notice it seems to drop a frame... or just pauses momentarily.

Unfortunately more and more modern games are using framerate smoothers, meaning you don't notice this; and as the game appears to run at full speed unless you check the FPS itself, you don't notice that it drops dramatically when smoothed, compared to normal where you would see the stutter.

An extremely good game to notice this in is Tomb Raider: Angel of Darkness; that game never performs right or quickly, as it's basically stock everything, with very templated code from DirectX 9 examples thrown together. That actually makes it a very good raw performance benchmark for DirectX 9.0c-capable cards.

Anyhow, the solution to that is simple, but expensive in most cases. You have to make sure that the bus speeds of your parts match... I'm not talking "get everything running at 800MHz" or such; although, yeah, that'd work, it doesn't have to be that precise.

That said, if you have one of the boards that still lets you select your FSB, try to match your processor with your memory. Then overclock your graphics card to match the memory performance and a multiple of your CPU speed.

Don't worry if you suddenly find yourself underclocking your 2.5GHz processor to 2.2GHz in order to fit a 200MHz bus or such; you'll find that even though it's running slower, you'll get more performance from it being able to get data as and when it needs to.

A good way to see bottlenecks in system performance and see what you need to "upgrade" is to run something like Microsoft PIX on it. As it breaks down what the system is doing, it's not great for actual game performance, but you can see if everything is running as smoothly as you might want it to.
