There's nothing wrong with AMD CPUs. The better ones will run just about everything fine. If you're building a monster gaming rig with dual video cards, 12 gigs of high-quality RAM and a huge display to push, then go Intel. If you're building a budget gaming rig, I'd go AMD so you can drop a little more on a video card and good RAM. The AMD Black Editions overclock quite nicely on stock cooling if you get a case with good ventilation.
Holy shit, seriously? There must be a trade-off if they can be overclocked so severely.
One trade-off is power consumption. It turns out that most people don't want a processor that puts out 200 W of heat in their system.
Another trade-off is reliability. If you clock it at 4.2 GHz at the stock voltage and cool it well, and it's stable like that, and you have a good motherboard and power supply to feed it power properly, then there isn't too much risk of frying it. But a lot of people won't get a motherboard, power supply, or heatsink appropriate to a large overclock, because those things all cost money. If you want to push it to 5 GHz, then you take a serious risk of frying the processor after some months or years, or possibly sooner if you get unlucky or had to nudge the voltage too far.
"i bet if amd listened to ati a bit more instead of being stuborn they would pass ahead of intel in term of speed even with their bigger die size."
AMD can't use Intel's fabs, so they're usually behind in process nodes. Even if they're on the same process node size, AMD has to use a very new process node, while Intel has a much more mature one. That's the main reason why AMD tends to be behind Intel in CPU performance. Right now, they're also behind because their architecture is simply worse. Bulldozer may change that later this month, however.
AMD doesn't have that disadvantage in video cards, as both AMD and Nvidia use other foundries, and currently mainly TSMC. If they're using exactly the same process node to produce their GPU chips, then the best architecture wins. And currently that's AMD's.
Not to get off topic... but whoever said nVidia vs. AMD/ATI is the same thing on the GPU side is COMPLETELY WRONG.
An ATI/AMD GPU that's the same price as a similar nVidia card is always better. ATI/AMD GPUs are better on the market right now than nVidia. Just find two cards that are similar in price and run the benchmarks; the ATI/AMD will always win.
I'm seeing a lot of posts mentioning overclocking. I didn't plan on overclocking my CPU. It seems pretty advanced and I don't know much about all of this stuff. Plus I don't want to fry anything. Would I still get good performance without overclocking my CPU? Even if I were to get, say, the i5 2500K?
I'm planning on building a new computer and was looking around on Newegg for some CPUs. My buddy told me how much cheaper AMD was compared to Intel. I headed over to Newegg and he was right. The most expensive AMD CPU I could find was around $190. What's the deal with this? How come Intel is so much more expensive?
The same reason Nvidia is more expensive than ATi.
AMD is nowhere near as good as Intel. I would never own an AMD machine or an ATi graphics card, especially with the drivers for ATi.
I have an AMD Phenom II X6 and an ATi HD3870x2... It's amazing what I can run on this... Catia opens up in 0.5 seconds, and I can have a Pro/E simulation running and still work in AutoCAD and Catia if needed. Not to mention I can run every game on ultra-high and it works flawlessly without glitches. But if I compare my other two PCs, where I have almost the same configuration with AMD/ATi and Intel/Nvidia, they are really not different... Anything I run there works the same.
Most of this garbage about Intel being better than AMD is commercial bitching, and 80% of it ain't true... Besides, the i7 has 4 cores and uses those 4 cores to simulate another 4, so it appears as an 8-core CPU when it really is a quad core with 8 threads... AMD has 6 REAL cores... 6 > 4 when I last checked, and for the price Intel isn't worth the money...
"Happiness is not a destination. It is a method of life." -------------------------------
In most cases, if anyone is just trying to build a gaming PC to run games at max settings and 60+ FPS, the AMD X6s will be more than enough to do this if you match them with a decent graphics card.
The most important thing you could ever do for a PC is to make sure it is well balanced, as Quizzical will back me up on, I'm sure.
There is no point in having a nice processor you plan to OC and getting a horrid mainboard. There's no point in getting stupidly rated RAM when there is next to zero performance increase in real-world tests. There's no point going for any build without a good quality PSU.
It's all about the balance between components and realizing which things you're stressing. The only two parts everyone should insist on nothing but quality for are the mainboard and the PSU. You can get away with cheaper graphics cards or processors or RAM and all it will do is be a little slower... but if you buy a crappy mainboard or PSU you're shooting yourself in the foot.
Think about the big picture and ask yourself some questions..
What resolution am I playing games at? (This is SUPER important, and so many overlook it. Here's an example: I play at 5280x1050, so games often go above 2GB of RAM usage, and I also multi-client, so 12GB of RAM would have been the right choice rather than 6GB. I also went for SLI cards with 1GB of RAM each, which is fine... until I up the AA levels or add Ambient Occlusion etc. and saturate the available GPU RAM in some games like Metro 2033, and then my FPS dies. I really should have gone for 1.25GB or more per card, or a single faster card with 2GB - see the rough buffer math sketched a bit further down.)
Do I want bells and whistles in games? (eg Ambient Occlusion, Higher levels of AA or even Super Sampling AA, Depth of Field and other effects.)
How many cores are my games and applications likely to use? How much use will I get out of Hyperthreading or a hex-core cpu or should I favour instructions per clock?
How much RAM am I going to need?
SSD or Hard disk?
All these things tie in with one another, and the most important part of putting together a build is balancing it all correctly. It's a learning curve all right and you'll get things wrong, but there are a few people here with their heads screwed on right, and if you give them the time and info they should keep you pretty straight.
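To put some rough numbers on the resolution question (a back-of-envelope sketch only; the buffer math below ignores textures, geometry and driver overhead, and the 2x2 SSAA factor is just the example I happen to run):
    def buffer_mb(width, height, bytes_per_pixel=4):
        # One 32-bit colour (or depth) buffer at the given resolution, in MB.
        return width * height * bytes_per_pixel / (1024 * 1024)

    base_w, base_h = 5040, 1050  # three 1680x1050 panels side by side, no bezel correction
    print("colour buffer:", round(buffer_mb(base_w, base_h), 1), "MB")
    print("colour + depth:", round(2 * buffer_mb(base_w, base_h), 1), "MB")

    # 2x2 supersampling renders at twice the width and height before downscaling,
    # so the render targets alone grow roughly 4x.
    print("2x2 SSAA colour + depth:", round(2 * buffer_mb(base_w * 2, base_h * 2), 1), "MB")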
As to why AMD are cheaper, let's just say you roughly get what you pay for in most cases. Here's a link that demonstrates this nicely: http://www.anandtech.com/show/4310/amd-phenom-ii-x4-980-black-edition-review - As you will see, the 3.7GHz quad core AMD gets rocked in single-threaded apps by the significantly lower clocked i5s, and in multi-threaded apps it just about manages to trade blows with the i7-920, which was released way back in November 2008 and is clocked a full 1GHz lower than the 980. Even overclocked it still doesn't beat the i5, and its power consumption is nowhere near as low either. It's cheaper, but then you see why.
OP: in static tests you might see a difference in the numbers, yes, but while you are gaming you will be hard pressed to find a difference. Why? Because 90% of games could run just as well on a Pentium 4 as on the latest i7 monster. Like I said, the CPU plays very little role nowadays, and I foresee in the future that ATI will release a graphics card that will make the CPU obsolete. A lot of people try to say no, it ain't so, but the sad truth is I bet that without many mods they could do everything the CPU does and more, even today. The only reason it hasn't happened isn't because the technology isn't there, but because they would need to learn how to program on the GPU. Hell, they need to learn that to multithread anyway, lol.
I've heard some say a CPU and a GPU are so different it could never happen. Yeah right, lol!
The sad truth is it can be done, and I bet a lot of users do it today.
That just isn't true, I'm sorry...
Nice example: my flatmate has a problem with Battlefield: Bad Company 2. When he loads up a map in multiplayer he stutters and lags for the first 20 seconds or so. He has a quad core CPU and a 4870, playing at 1920x1080. I throw my old 4870x2 over to him and the lag is worse. I tell him to go overclock his CPU, which he really, really doesn't want to do.
I catch him in the BIOS a few weeks later setting his RAM timings manually after a BIOS reset. I show him what to change and he overclocks the CPU to about 3.7GHz without any issues.
He boots up Battlefield and the stutter/lag is gone when loading into maps. It turns out our other flatmate gets the same thing happening to him when loading maps. He's using a Core 2 CPU, I believe the stock speed on it is 2.8GHz, and he hasn't touched it. Funnily enough, even at 40-50fps (as opposed to the constant 60fps they get), playing across 3 monitors I have no lag or stutter at all at any point in the game.
You're also wrong with your second statement, at least for the foreseeable future; CPUs and GPUs are good at totally different things. GPUs are good at dealing with many simultaneous requests that each require some calculation, like breaking cryptography, simulating protein folding (Folding@Home) or Bitcoin mining. CPUs, on the other hand, are very good at running a single stream of instructions extremely quickly.
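If you want a feel for the "many independent jobs" shape that a GPU likes, here's a minimal Python sketch; it uses multiprocessing on CPU cores as a stand-in (not actual GPU code), and the data blocks are made up purely for the example:
    import hashlib
    from multiprocessing import Pool

    def sha256_hex(block: bytes) -> str:
        # Each block is hashed independently of every other block, which is
        # exactly the shape of work that miners and Folding@Home exploit.
        return hashlib.sha256(block).hexdigest()

    if __name__ == "__main__":
        # Fake work units; in a miner these would be candidate block headers.
        blocks = [i.to_bytes(8, "little") * 1024 for i in range(2000)]

        with Pool() as pool:  # spread the independent jobs across CPU cores
            digests = pool.map(sha256_hex, blocks)

        print(len(digests), "independent hashes computed in parallel")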
I don't play that game so I wouldn't know, but I did say 90% of games. You found one of the ones that are in the 10%! Cheers to you!
It also happens with the same guy when we host LAN games of Sins of a Solar Empire. His PC starts lagging behind as the number of active units in the game increases, well before mine does. As a result my game is constantly waiting for his game to sync back up.
Really, you can go on believing CPUs have little impact if you want, but I'm telling you that just isn't the case. At least nowhere near the scope you are making it out to be.
Oh, I did forget to say: I believe he has a first gen Phenom quad core at either 3.7 or 3.8. I have an i7-950 running at 4.3GHz. I do have faster GPUs than him, but because I play at a much higher resolution and often run 2x2 SSAA I actually get lower overall FPS than he does. It's a classic example of his CPU bottlenecking his system, since his GPU has more headroom than mine yet I don't suffer from it. In these situations he essentially has an imbalance between his GPU and CPU, meaning that he could be increasing the detail in game without suffering overall, while I'm doing the exact opposite by keeping spare CPU headroom, as the GPU portion of the load can be a lot more flexible.
Zezda, how much is your system? Honestly you sound like you're running a beast, but if it's 4-5 grand not many people would be interested besides the obvious bleeding-edge tech guys.
My system runs everything on max ultra (60fps+) yet seems way crappy compared to what you're running.
What are the benefits of running a system that won't be fully supported by the game industry until 2020?
Not having to upgrade for a while would be the only benefit? I don't even see that as a valid argument, because I can replace my CPU and GPU three times over that time period and not spend as much.
Honestly I don't even see the point in these AMD vs Intel debates; once you get to the point of running a 1-1.5k rig or more, you're basically using hardware that can run shit on max ultra anyway and that games are hardly optimized for.
It's due for a refresh once socket 2011 is here, as well as the next gen GPUs, which should be quite nice.
For now though, everything but the tower
3x Dell 2209WA IPS monitors (5280x1050 with bezel correction)
Creative s750 Gigaworks 7.1 speakers
Razer Mamba mouse, Destructor pad, Tarantula keyboard & Megalodon headset (Honestly, no idea how I ended up with all Razer stuff, it was all bought over a long time period to replace other things)
The tower itself
Silverstone FT02
i7-950 with a Thermolab Baram cooler (bit long in the tooth these days but it still does a nice job), running anywhere from stock at 3.06 to 4.3-4.4GHz
6GB 1600MHz OCZ RAM (this is about as close to its officially rated speed as I can get it when OC'ing)
Creative X-Fi Fatality pro gamer
2x EVGA Superclocked GTX 460s running about 875/1950 each
Corsair AX1200 PSU
Price? Some stuff I've used for years, like the 7.1 speakers; other stuff is quite new, like the mouse. In total though, including all the little bits and pieces, it probably comes in at over £3000 - a lot of that is in accessories though, like I didn't need to buy IPS panels at £200 each, and the 7.1 speakers were over £300 when new.
Two things I should have avoided though were the 1GB GPUs, and I should have got 12GB of RAM rather than 6GB. But rather than try to remedy that just now, I'll sell what I won't reuse and wait for the new stuff to hit the shelves before making any more changes. I do have a massively underused PSU, but I got it since it was clearly the best 1k+ watt unit going and I had to replace my old PSU, which I broke in a fit of stupidity (it was also responsible for powering a hefty OC on the CPU as well as a 4870x2, which was a hell of a power hog). The other good excuse is that I do PC and server network support for a living, and I use the PC at home to help with things on the side outside my main job, like setting up networks remotely or fixing up companies' SharePoint sites etc.
As for the other questions... If you want to continue playing at your current level, regardless of what that is, then unless you are willing to pay over the odds you're going to last about as long as anyone else. There are gains to be made by making smart choices (the original i7 series was a good buy imo), and for the higher end stuff a lot of the investment is made at the beginning, since you can sell on older parts - like the 4870x2 sitting next to me gathering dust that I really should have sold on when I got the 460s. I dare say you still end up paying more overall to maintain that higher level of performance, but the difference isn't quite as big as it would first seem, at least in the long term.
At the end of the day I need more expensive parts because my gaming habit requires it. I like playing with the detail very high and I like playing on 3 monitors. Had I not been playing across the three monitors then I wouldn't even dream of updating anything for a heck of a long time yet. It's like I said before, it's all in the balance. If it turns out I can't afford the motherboard/CPU/RAM upgrade then I can swap in new GPUs and still be fine, since I left myself that CPU headroom.
OP, because the Intel CPUs are better.
However, my current system is an AMD Phenom II X6 1100T at 3.3GHz (OC'd to 4.0), an AMD Radeon HD 6970 2GB, and 16GB of RAM.
The damn thing runs anything I toss at it on max ultra. And for the price? I paid 800-900 bucks in parts.
Honestly I would be shocked if I had to drop any game released in the next three years down to high settings.
Well worth the investment for a cheapish gaming rig.
http://lenzfire.com/2011/07/amd-bulldozer-release-date-finalised-24475/
Physical cores > Simulated cores any day of the year.
Intel may be better than AMD right now, but it was not always the case, and may not always be the case.
I remember way back when some Intel CPUs were just rebadged AMDs.
My i5 overclocked to 4.02 GHz is a beast. I'd recommend an i5, not an i7, if you do look at Intel.
Some people like bang for buck while getting less performance, but I have always gone with Intel and Nvidia.
My current spec is..
AlienWare Aurora ALX
Windows 7 64bit Ultimate
Intel Core i7 Extreme Edition 975 3.3GHz OC at 4.2GHz
AlienWare MicroATX LGA 1366 Intel X58
Corsair DOMINATOR 12GB DDR3 1600MHz
2x GeForce GTX 580 1536MB SLi
Liquid Cooling - 120mm Single Radiator
Seagate Barracuda 1TB 7200 RPM HDD in RAID 0
512GB SSD (dual drives, 2x 256GB)
AlienWare 875 Watt PSU
I am a big fan of AMD.
AMD CPUs are far less expensive and will give you great value. However, their top of the line CPUs barely hold a candle to Intel's top. You will pay a lot more for Intel, and you will get a little bit more power from Intel. You just have to decide if that bit of extra power is worth 200-400 dollars more.
I would just build a Phenom X6 system, 12GB RAM DDR3 (I guess DDR4 is coming out), use the extra cash on an SSD for your OS, and get two Radeon 6970s for CrossFire. That, at the moment, will run you somewhere around 1500 or so. I would wait until the end of the quarter to build a new PC, though.
I used to have the i5 750 2.66GHz in my previous build, which was a beast for overclocking.
If you take an Intel and an AMD CPU with the same performance, then for the price of the Intel you get that AMD CPU plus a good mainboard. But sure, choose Intel because "it's better".
That's only really true if you're talking lower end processors; a cheaper i5, or even an i3, can stomp all over higher clocked AMD parts. The only thing AMD's got going for it right now is that they can get 3 or 4 cores onto a CPU cheaper than Intel, but that doesn't mean the performance is there just because it's an X3 or X4. Again, all about the balance, baby!
"The same reason Nvidia is more expensive than ATi. AMD is nowhere near as good as Intel. I would never own an AMD machine or an ATi graphics card, especially with the drivers for ATi."
Nonsense. In some market segments, the reason Nvidia is more expensive than AMD is that Nvidia's parts cost far more to build. If they tried to match AMD in price for a given level of performance, they'd lose money on every card sold. So they have to charge high enough prices that the main reason people pay them is not knowing any better. In about the $140-$600 range (loosely, anything between a GeForce GTX 460 and two GeForce GTX 570s in SLI), both vendors are competitive, but outside of that, if you want something that is a good value for the money, AMD is the only game in town.
AMD's desktop processors aren't as good as Intel's today, but that could change. Six or seven years ago, the situation was reversed. Indeed, in the low power laptop space, AMD is way ahead of Intel already. The only reason anyone buys an Intel Atom system is not knowing any better. But that, too, could change, with the launch of Silvermont in 2013.
In the consumer video card space, AMD's video cards are way ahead of Nvidia's in just about any metric you want to use. Nvidia does have the fastest top end GPU, because they're willing to go a lot larger on die size, but that's about all that Nvidia has in their favor. This also could change in the future, perhaps as early as the release of Kepler next year. Four years ago, Nvidia was way ahead.
-----
"Physical cores > Simulated cores any day of year."
Only if you run programs that can take advantage of the extra cores. It will likely be a while before games see much advantage to having more than four cores available. Also, one Sandy Bridge core beats two Atom cores even in programs that can use both cores.
"I remember way back when some Intel CPUs were just rebadged AMDs."
When was this, exactly? Because I'm rather skeptical, though I'll concede that I have no clue what Intel did more than about 30 years ago.
"AMD has 6 REAL cores.. 6 > 4 when I last checked"
Six weaker cores don't beat 4 more powerful cores in programs that don't scale past four cores. It also matters how powerful the cores are. A Core i7 2600K with four cores will tend to beat a Phenom II X6 1100T with six cores even in programs that can take advantage of all six cores.
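As a toy model of that point (the per-core speeds below are invented for illustration only, not benchmark figures):
    # Toy model: throughput = per-core speed x the number of cores the program can actually use.
    def effective_throughput(per_core_speed, physical_cores, threads_used_by_program):
        usable = min(physical_cores, threads_used_by_program)
        return per_core_speed * usable

    FAST_QUAD = 1.0  # pretend each strong core does 1.0 units of work
    SLOW_HEX = 0.6   # pretend each weaker core does 0.6 units of work

    for threads in (1, 2, 4, 6):
        quad = effective_throughput(FAST_QUAD, 4, threads)
        hexa = effective_throughput(SLOW_HEX, 6, threads)
        print(f"{threads} threads: 4 fast cores -> {quad:.1f}, 6 slow cores -> {hexa:.1f}")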
"In most cases, if anyone is just trying to build a gaming pc to run games at max settings and 60+ FPS. The AMD X6s will be more than enough to do this if you match with a decent graphic card."
Except that for gaming purposes, there isn't much advantage of a Phenom II X6 over a Phenom II X4, and the quad core is much cheaper.
"What resolution am I playing games at? (This is SUPER important, so many overlook it. Here's an example - I play at 5280x1050 so games often go above 2gb of ram usage and I also multi-client so 12gb of ram should have been the right choice rather than 6gb. Also in addition I went for SLI cards with 1gb of RAM each, that's fine... until I up the AA levels or add Ambient Occlusion etc and saturate the available gpu ram on some games like Metro 2033 and then my fps dies. I really should have went for 1.25gb or more per card or go for a single faster card with 2gb.)"
You probably mean 5240x1050. Anyway, at super high resolutions, AMD handily crushes Nvidia. At resolutions like that, two Radeon HD 6950s in CrossFire can sometimes beat two GeForce GTX 580s in SLI, at barely half the price tag and power consumption. That's partially due to 2 GB of video memory per card, but also because AMD's shader-heavy architecture scales better to higher resolutions.
"As to why AMD are cheaper, let's just say you roughly get what you pay for in most cases."
In many cases, that's true. But it's very possible to massively overpay. For example, this isn't even competitive with a Phenom II X4 that you can get for half its price:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819115041
"cause 90% of game could run just as well on pentium 4 then on a i7 latest monster like i said cpu play very little role now a day"
Actually, no. A Pentium 4 is old enough that it will severely bottleneck an awful lot of games.
"and i foresee the futur that ati will release a graphic card that will make cpu obsolete .a lot try to say nono it aint so but the sad truth is i bet with not many mods they could do all the cpu does and more even today i bet they could the only reasobn it hasnt happened isnt because the techno isnt there but because they would need to learn how to program in gpu.hell they need to learn it to multithread anyway lol."
Video cards completely choke if you try to run something single-threaded on them. CPUs are far, far better at that. Some algorithms are intrinsically single-threaded, and trying to run them on a video card isn't going to work. In order to properly exploit a video card, you need to not merely be able to do several things in parallel, but several hundred.
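A tiny sketch of the two shapes of work (plain Python, nothing GPU-specific; the logistic-map loop and the list of squares are just stand-ins):
    # Intrinsically sequential: each step needs the previous result, so there is
    # nothing for hundreds of GPU threads to chew on at once.
    def iterate_map(x, steps):
        for _ in range(steps):
            x = 3.9 * x * (1.0 - x)  # step n+1 depends entirely on step n
        return x

    # Embarrassingly parallel: every element is independent, so the same work
    # could be split across hundreds or thousands of GPU threads.
    def square_all(values):
        return [v * v for v in values]  # no element needs any other element

    print(iterate_map(0.5, 1_000_000))      # has to run one step after another
    print(sum(square_all(range(100_000))))  # order doesn't matter at all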
"I believe he has a first gen Phenom quad core at either 3.7 or 3.8."
That's unlikely, as a Phenom (not Phenom II) overclocking to 3.7 GHz on air would be extremely rare. The top bin quad core was 2.6 GHz, and they didn't have much overclocking headroom.
"i7-950 with a Thermolab Baram cooler (Bit long in the tooth these days but it still does a nice job) running anywhere from stock at 3.06 to 4.3-4.4Ghz"
"2x EVGA Superclocked GTX460's running about 875/1950 each"
I'm skeptical of your claimed overclocks. Bloomfield at 4 GHz was pretty common, but 4.4 GHz on air takes more than a little luck, and even then, is likely to fry the thing.
Even the press edition EVGA GeForce GTX 460 FTW at 850 MHz core was often unstable. Two of them both at 875 MHz without a premium cooler to handle that? Could happen, but I'm skeptical. And there is absolutely no way that you're clocking anything in a GTX 460 at 1950 MHz on air. Even on liquid nitrogen, the shaders are the only thing with any hope of coming near that clock speed, and that's probably not what you meant, as the shader clock speed is locked at twice the core clock speed.
"Two things I should have avoided though was the 1gb gpu's and I should have got 12gb of ram rather than 6gb."
You do realize that 32-bit programs, such as basically every game on the market, are hard-capped at 2 GB for the entire program, I hope.
"The original i7 series was a good buy imo"
Was, until about September 2009. After that, it was no longer a good buy for more than a tiny handful of people.
"Some people like bang for buck while getting less performance but i have always gone with Intel and Nvidia."
Sometimes getting maximum performance requires going with AMD for the processor and/or video card. Intel does dominate the top end performance in processors today, but if you're willing to go multi-GPU, then AMD wins in video cards.
"I would just build a Phenom X6 system, 12gb RAM DDR3"
Thuban has a dual channel memory controller, so your amount of system memory should be a power of 2. 12 is not a power of 2.
"That's only really true if you're talking lower end processors, a cheaper i5 or i3 even can stomp all over higher clocked AMD parts."
Some of Intel's Core i7s and Core i5s do handily beat AMD's processors. But the Core i3s do not. Even in purely single-threaded performance, Intel's fastest Core i3 may beat a Phenom II X6 1100T, but not often by much. And once you start using a third or fourth core, it's all over for the Core i3.
@ Quizzical
'You probably mean 5240x1050. Anyway, at super high resolutions, AMD handily crushes Nvidia. At resolutions like that, two Radeon HD 6950s in CrossFire can sometimes beat two GeForce GTX 580s in SLI, at barely half the price tag and power consumption. That's partially due to 2 GB of video memory per card, but also because AMD's shader-heavy architecture scales better to higher resolutions.'
Nah, it's 5280x1050 once bezel corrected, 5040x1050 without. But yes, AMD would probably have been a better shout, due to both MLAA (rather than SSAA... god, that kills FPS so hard) and more RAM per card. I was so, so close to going for the 2GB 460s when they were just released, and probably should have with the benefit of hindsight. Having said that, a pair of the AMD cards that came out not long afterwards would have been better still. It's always the same when it comes to these things, though!
As for the cards, they sort of trade blows between the games I play for the most part. I think the AMD ones would have made the most sense due to them having MLAA, but I was in the mood to get something then and there, and the 460s did exactly what the 470 and 480 couldn't do, and there wasn't really an AMD alternative at the time.
'That's unlikely, as a Phenom (not Phenom II) overclocking to 3.7 GHz on air would be extremely rare. The top bin quad core was 2.6 GHz, and they didn't have much overclocking headroom.'
Yeah, I agree. I thought it was a Phenom II as well, and suspect it is if he is messing with BIOS settings for RAM. Perhaps he just said Phenom and I assumed he meant first gen. I will admit I have a very sketchy memory as to what the usual clocks for those were, so it is indeed probably some form of Phenom II.
'I'm skeptical of your claimed overclocks. Bloomfield at 4 GHz was pretty common, but 4.4 GHz on air takes more than a little luck, and even then, is likely to fry the thing.'
The D0 stepping of the 920 and later models really lowered the voltage required to get a good clock on the CPU, by a huge amount. Anything toward 4.5GHz on air is indeed very good, and the reason I'm running at 4.3 just now is simply that it isn't quite as extreme to run '24/7' voltage-wise, and my temps are still quite nice all things considered. Pushing it an extra 100MHz, while probably possible, just isn't practical for me at all. Hell, 4.3GHz is probably overkill, but I don't run my overclocks all day every day; if I'm going to spend a few days only using the PC for work etc., they get turned back to stock.
The 460s are blower-style coolers matched up with the FT02. A rather insane amount of airflow goes straight through them and straight out the top of the case. They also have a slot between them, so they aren't sitting right on top of one another. Right now the room temp is 25 degrees Celsius, and after 15 minutes of 100% load on both cards the top card is bouncing between 82/83 and the lower is 79/80; that's with 1.025 V across the core (max temp is 105 degrees if memory serves me right). And I did mean 1950 for the memory, sorry, not the shaders; as you rightly said, that's locked to 2x the core. Various stress tests and whatnot don't pick anything up, and I can't see any visible artifacts while playing games etc. Increasing either the core or memory much past these will cause problems after a while of testing/playing, but not usually within the first 5 minutes.
'You do realize that 32-bit programs, such as basically every game on the market, are hard-capped at 2 GB for the entire program, I hope.'
Yes, generally for any game that might creep up on the limit, such as WoW, I would add the LARGEADDRESSAWARE flag to stop Windows smashing it over the head. It has certainly done the trick for at least one or two games (the main culprit being WoW; I got to fly around for about 20 seconds before getting an Out of Memory error while on the 3 monitors, and the large-address flag fixed it instantly, after their technical support told me there was no need to do so because the game would never consume anywhere near 2GB of RAM). But the main reason for the 12GB of RAM was that I often multi-client games like EVE Online. For EVE I have to turn down the settings in game to run 3 clients at once, whereas if I had the 12GB I would have been fine (that's for alt-tabbing between the three clients... one client on each screen is easier on the RAM but harder on the GPU, it seems).
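For anyone wondering how to check whether an exe already has that flag, here's a small sketch that reads the bit straight out of the PE header; the WoW path at the bottom is only a placeholder, and Microsoft's editbin /LARGEADDRESSAWARE is the usual way to actually set the bit:
    import struct

    IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # bit in the COFF Characteristics field

    def is_large_address_aware(exe_path):
        # Returns True if the executable advertises /LARGEADDRESSAWARE to Windows.
        with open(exe_path, "rb") as f:
            if f.read(2) != b"MZ":
                raise ValueError("not a DOS/PE executable")
            f.seek(0x3C)                                  # e_lfanew: offset of the PE header
            pe_offset = struct.unpack("<I", f.read(4))[0]
            f.seek(pe_offset)
            if f.read(4) != b"PE\x00\x00":
                raise ValueError("PE signature not found")
            f.seek(pe_offset + 22)                        # Characteristics field, 2 bytes
            characteristics = struct.unpack("<H", f.read(2))[0]
        return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

    # Example path only; point it at whatever game executable you care about.
    print(is_large_address_aware(r"C:\Games\World of Warcraft\Wow.exe"))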
'Was, until about September 2009. After that, it was no longer a good buy for more than a tiny handful of people.'
Yeah, but you would still have had something that was arguably 'better' overall until Sandy Bridge hit, or perhaps until one of the later Lynnfields at least. Either way, as far as CPUs go it wasn't that bad an investment compared to other options, to be fair, as far as longevity goes.
'Some of Intel's Core i7s and Core i5s do handily beat AMD's processors. But the Core i3s do not. Even in purely single-threaded performance, Intel's fastest Core i3 may beat a Phenom II X6 1100T, but not often by much. And once you start using a third or fourth core, it's all over for the Core i3.'
Yes, sorry. I was focusing more on games, but yeah, if you can use those cores then, unless you're splashing out on an i5 or i7, the more-cores approach from AMD is worth a look indeed. It just depends on how much of that you think you'll use. Having said that, they did at least put HT on the i3s, so while they're not real quad cores they'll still get a small benefit from the extra threads.