
Geek Culture / When two tribes go to war (Nvidia vs ATI)

Author
Message
GatorHex
19
Years of Service
User Offline
Joined: 5th Apr 2005
Location: Gunchester, UK
Posted: 14th Nov 2007 23:04 Edited at: 14th Nov 2007 23:17
Whoa, this is probably old news, but I only just noticed: Nvidia and ATI look like they're getting into a price war.

I've started to see the 8800GT and 2900 Pro for as little as £150 (even cheaper on the US side at $250).

That's more like it; I might be tempted to sell the 8600GT.

From this image it looks like the 8800GT won't take up two slots either. I bet ATI aren't happy...


DinoHunter (still no nVidia compo voucher!), CPU/GPU Benchmark, DarkFish Encryption DLL, War MMOG (WIP), 3D Model Viewer
EdzUp
22
Years of Service
User Offline
Joined: 8th Sep 2002
Location: UK
Posted: 14th Nov 2007 23:15
ATI are just a brand name now that AMD own them

-EdzUp
NeX the Fairly Fast Ferret
19
Years of Service
User Offline
Joined: 10th Apr 2005
Location: The Fifth Plane of Oblivion
Posted: 14th Nov 2007 23:46
It doesn't take up two slots... but it's very long. It'd probably not fit my PC (even if it did have anything better than AGP 8x) because it'd hit the heatsink on the sound chip.


Since the other one was scaring you guys so much...
Agent Dink
20
Years of Service
User Offline
Joined: 30th Mar 2004
Location:
Posted: 15th Nov 2007 02:21
It is long, but it's no longer than the normal 8800s

I sorta wish I'd bought one of those, but it wasn't quite as good a deal as the one I got at the time.

Warning: Please be advised. Geek Culture is under lockdown. All mods are set to Indi mode. Any and all topics WILL BE LOCKED. Post at your own risk!
Keo C
17
Years of Service
User Offline
Joined: 3rd Aug 2007
Location: Somewhere between here and there.
Posted: 15th Nov 2007 02:22
I think I read in one issue of CPU (Computer Power User) that the 2900 Pro struggles to match the higher 7-series GeForce cards, or something like that.


Osiris
20
Years of Service
User Offline
Joined: 6th Aug 2004
Location: Robbinsdale, MN
Posted: 15th Nov 2007 02:54
Well...it looks pretty.

RIP Max-Tuesday, November 2 2007
You will be dearly missed.
CattleRustler
Retired Moderator
21
Years of Service
User Offline
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 15th Nov 2007 03:23 Edited at: 15th Nov 2007 03:30
I just got the XFX nv8800GTS 320-bit 320MB for 269.00 (after 30.00 rebate) from Newegg. Not a bad deal. Sucker is a double-wide and weighs a ton, not to mention it's a power guzzler, and it can easily overheat because Nvidia's nTune/NVControl Panel/auto-fan stuff is hopelessly broken.

I have been using ATITool 0.27b and RivaTuner 2.06 to learn exactly what's going on, and have since uninstalled nTune (useless) from my machine. I'm going to use the RivaTuner API and make my own little app/service that monitors GPU temp from the on-die diode and the fan duty cycle % via the ADT7473 controller, and ramps the fans based on GPU temps. I know Riva does this, but it's not as good as it could be for the 8800 series, so I'll make my own.

If you have an 8800, beware: the auto fan by Nvidia is broken!! It doesn't ramp the fan up with high temps like it should. My brand new card kept freezing up the system due to overheating until I realised what the problem was, grrr. I may even void my warranty and do an Arctic Silver 5 application to the GPU and mem chips under the giant removable heatsink/fan assembly.

I currently have Riva setting the fan to 80% at startup, which keeps the GPU at 60C in Windows, although if I play vids in MP it goes to 65C but comes back down, which is fine. In a game, though, 80% is eventually going to overheat; the stock settings would cause games to hang after 15 or 30 mins, depending on the game. When I manually set the fan to 100% duty cycle nothing locked up or overheated, and the GPU never got above 80C, where previously 95C would kill it.
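The temp-to-fan ramp described above can be sketched as a simple fan curve plus polling loop. This is just an illustration of the idea, not the actual app: the hardware accessors (`read_gpu_temp_c`, `set_fan_duty`) are hypothetical stand-ins for whatever the RivaTuner API / ADT7473 controller really exposes, and the curve values are made-up examples.

```python
import time

# Hypothetical fan curve: (temperature threshold in C, fan duty cycle in %),
# ordered low to high. Example values only.
FAN_CURVE = [(0, 40), (55, 60), (65, 80), (75, 100)]

def duty_for_temp(temp_c):
    """Return the duty cycle for the highest threshold at or below temp_c."""
    duty = FAN_CURVE[0][1]
    for threshold, pct in FAN_CURVE:
        if temp_c >= threshold:
            duty = pct
    return duty

def monitor_loop(read_gpu_temp_c, set_fan_duty, poll_seconds=2):
    """Poll the on-die diode and ramp the fan accordingly; Ctrl-C to stop.

    read_gpu_temp_c and set_fan_duty are hypothetical callables standing in
    for the real RivaTuner / ADT7473 access functions.
    """
    last_duty = None
    while True:
        duty = duty_for_temp(read_gpu_temp_c())
        if duty != last_duty:  # only touch the controller when the step changes
            set_fan_duty(duty)
            last_duty = duty
        time.sleep(poll_seconds)
```

A stepped curve like this avoids constantly rewriting the controller register, while still jumping straight to 100% duty once the card crosses the top threshold.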

My DBP plugins page is now hosted [href]here[/href]
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 15th Nov 2007 09:34
Quote: "From this image it looks like the 8800GT won't take up two slots either. I bet ATI aren't happy..."


Actually, it seems like you've been out of the hardware loop for a month or so. Within days of NVIDIA announcing their new mid-range 8800GT, AMD threw them a major curveball.

It comes in two shapes, and shows off part of their next-generation technology.

The RV670 (the final cards in the Radeon HD series) have been deceptively named 3850 and 3870. They're both hitting the market in the US today, and should reach British shores by the end of the month.

These cards fall into the $150 and $250 price ranges respectively, and are designed to compete directly with the GeForce 8800GT and 8800 Ultra cards.

What is quite awesome about these new cards, though, is their technical specifications. The first thing you'll notice is that they boast support for the new DirectX 10.1 coming with Vista Service Pack 1.

AMD have not only made the first 55nm chips, but done something quite amazing. Usually the power requirements for cards with basically top-end performance border on the ridiculous.

The 8800GTX, for example, I believe takes almost 220 watts just to run, which means you need at least a 400-500 watt PSU just to be able to play games.

So what I find truly awesome about these new AMD cards is that even in the most intensive 3D games, they max out at only 75 and 95 watts respectively. Bear in mind these are cards that can play Crysis totally maxed out as well as any current card on the market.

Another really cool thing is they use the new PCIe v2.0 x16 system. With the updates to CrossFire, you can now run up to 8 (yes, EIGHT) cards in the same machine. That's just mind-blowing, although AMD have said the first drivers will only support up to 4 cards at once.

Yet another aspect of the CrossFire update is that you no longer need identical cards. The new asymmetrical system they've added allows you to throw any two CrossFire cards together and, bam, instant power boost.

Combine this with 256-bit GDDR4 memory: these cards are damn cheap, use less power, and run as well as today's top end with fewer transistors (2900XT: 700 million; 38x0: 666 million) in smaller packages. Plus, as they include AMD's Quiet'n'Cool system, when you're not pushing them to their limits they fade into the background.

These cards really are shaping up impressively. They're here to bridge the gap by providing DX10.1 compatibility, and to show NVIDIA that the next round of RV700 vs G90 cards is going to be something fairly epic (well, if you're into graphics cards, lol).

Given that realistically the battle between graphics card manufacturers is fought in the mid-range (something neither has really had on the market this generation), their fight over the DirectX 10 generation has really only just begun.

Arkheii
21
Years of Service
User Offline
Joined: 15th Jun 2003
Location: QC, Philippines
Posted: 17th Nov 2007 06:27 Edited at: 17th Nov 2007 06:30
The HD3xxx cards are being pre-sold at a slightly higher price than expected here (the cheapest HD3850 is PhP 9,500, roughly US$211). My guess is that initial supplies are still limited.

Here's the dual core HD3870 X2...

http://www.fudzilla.com/images/stories/2007/November/General%20News/r6c2.jpg

http://www.fudzilla.com/images/stories/2007/November/General%20News/r6c1.jpg

GatorHex
19
Years of Service
User Offline
Joined: 5th Apr 2005
Location: Gunchester, UK
Posted: 17th Nov 2007 10:20 Edited at: 17th Nov 2007 10:22
The 38xx won't be as fast as the 8800GT either so I've heard, will use more power, will sit over 2 slots. Also the drivers work poo wit FPSC X and DBP X

DinoHunter (still no nVidia compo voucher!), CPU/GPU Benchmark, DarkFish Encryption DLL, War MMOG (WIP), 3D Model Viewer
Raven
19
Years of Service
User Offline
Joined: 23rd Mar 2005
Location: Hertfordshire, England
Posted: 17th Nov 2007 12:22
Quote: "The 38xx won't be as fast as the 8800GT either so I've heard, will use more power, will sit over 2 slots. Also the drivers work poo wit FPSC X and DBP X"


The 3850 takes up a single slot, uses 75 watts (as opposed to the 8800GT's 160 watts) and performs on par with the 8800GTS.

The 3870 takes up two slots, uses 95 watts (as opposed to the 8800 Ultra's 220 watts) and performs on par with the 8800 Ultra.

As for TGC X10 product compatibility: this is a new series of cards not yet supported by Catalyst (you have to use the drivers provided with them, Cat 7.9). AMD (ATI) haven't allowed anyone to have these cards prior to release (which is why there are no reviews of them yet), so unless TGC bought one direct from US suppliers, they don't have a card on which to test your "drivers work poo wit FPSC X and DBP X" claim.

I heard that there are issues with the Radeon HD 2900 XT using Catalyst 7.7 and 7.8; regardless, I asked Rick if I could beta test their X10s, given I own 2400, 2600 and 2900 cards.
All of them have so far been far more compatible with all the DirectX 9 & 10 titles I've played and bought over the last 6 months, as well as far more legacy games, than the GeForce 6100 and 8800GTS cards I also own.

In fact I have the GeForce 8800GTS and Radeon 2900 XT in the same machine; thanks to Vista's capabilities you can switch between Forceware (currently running 163.75 WHQL) and Catalyst (currently running 7.8) on the fly without rebooting.

While the 8800GTS seems to have a bit of extra performance under the hood, trying to run legacy games on it is very hit-and-miss.
For example, I updated my driver last night and Knights of the Old Republic 2 died. No explanation as to why; just bam, gone.
Before that I lost all of my fullscreen effects, as well as shadows.
Enabling FSAA dropped the framerate by half.
And for some reason the game takes a good 2-3 minutes to switch between video and game.
The 6100 suffers from the exact same problems, as they're purely driver based.

The old 5200 Ultra I have has no issues (well, except a painfully slow framerate), and equally I've had no issues with any of my Radeons on it.

In fact, the only way I've found to run most legacy DirectX 9 games in Vista is basically to use the Microsoft reference drivers, or the release Forceware.

Quite frankly, I'm inclined to believe any issues the TGC X10 products have with Radeons are purely down to poor programming. I've been developing with Radeons for a couple of years now, and breaking Radeon support most often happens when making GeForce-specific optimisations; otherwise they'll tick over code without any issues.
