I don't know what Quizz has done to you that you always have to go after him and his opinions.
1) There is not a word in the article about the Fury X. 2) He argues that the Fury X, which isn't even the topic, is much faster than the GTX 980 and that the speed difference will make up for the difference in power consumption.
It does not bother him that the Fury X also costs almost 50% more than the GTX 980; the cards are completely incomparable, and the argument he is making is silly.
From the article:
"The Sapphire Radeon R9 Fury X Nitro's power consumption plummets from 279W all the way down to a much more moderate 213W."
Upon further review, it looks like they're talking about a Fury non-X, and my mistake was grabbing the card name from that sentence of the review rather than a different one. Regardless, the non-X Radeon R9 Fury is also a much faster card than a GeForce GTX 980.
Furthermore, the argument was purely about performance per watt, not performance per dollar. But if you want to talk about prices, the latest Newegg prices including shipping and before rebate are $518 on a Radeon R9 Fury, $468 on a GeForce GTX 980, and $635 on a Radeon R9 Fury X. As compared to a GeForce GTX 980, the Radeon R9 Fury costs about 11% more and the Fury X about 36% more. I wouldn't think of 36% as "almost" 50%, but then, you think of two CPU cores as almost six and one memory channel as almost two, so at least you're consistent there.
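For anyone who wants to double-check that arithmetic, here is a quick sketch using the Newegg prices quoted above; the prices are from the post, the rounding is mine:

```python
# Newegg prices quoted above, shipping included, before rebate
prices = {
    "Radeon R9 Fury": 518,
    "GeForce GTX 980": 468,
    "Radeon R9 Fury X": 635,
}

baseline = prices["GeForce GTX 980"]
for card in ("Radeon R9 Fury", "Radeon R9 Fury X"):
    premium = (prices[card] / baseline - 1) * 100
    print(f"{card}: {premium:.0f}% more than a GTX 980")
# Radeon R9 Fury: 11% more than a GTX 980
# Radeon R9 Fury X: 36% more than a GTX 980
```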
It's an interesting article. It's also extremely easy to take it way out of its intended context.
You have to deal with the effects of PowerTune (AMD's power management) and Turbo Boost (nVidia's power management), neither of which will let you directly control voltage. They could have just as easily played with the Target Power or Target Temperature settings and gotten similar results, because those, along with a bunch of other factors, go into determining how the chip actually runs.
P = I*E, the basic DC power equation that goes hand in hand with Ohm's law. I don't think anyone disputes that, at least for DC circuits. (It makes a nice acronym to help you remember, particularly for those in elementary school physics.)
I don't know why people dispute power as a function of frequency... it blows my mind. The CMOS dynamic power equation (which, if Ohm's law is elementary school, would be middle school level, I guess) clearly shows the effect: P = F*A*C*(E^2), where F is the switching frequency, A is the activity factor, C is the switched capacitance, and E is the supply voltage. (It also makes a nice acronym to help you remember.) It's a variation of the AC power equation, which makes sense, since a DC signal stepping at some frequency is not dissimilar to an AC signal.
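To make the shape of that equation concrete, here's a minimal sketch. The GPU-ish constants are made up purely for illustration, but the point survives any choice of numbers: voltage enters squared, frequency only linearly.

```python
# Dynamic CMOS power: P = F * A * C * E^2
#   F = switching frequency (Hz), A = activity factor (0..1),
#   C = switched capacitance (F), E = supply voltage (V)
def dynamic_power(f, a, c, e):
    return f * a * c * e ** 2

# Hypothetical GPU-ish constants, for illustration only
F, A, C = 1.05e9, 0.25, 5.0e-7

stock        = dynamic_power(F, A, C, 1.20)           # ~189 W
undervolted  = dynamic_power(F, A, C, 1.05)           # 12.5% less voltage -> ~23% less power
underclocked = dynamic_power(F * 0.875, A, C, 1.20)   # 12.5% less clock   -> ~12.5% less power

for name, p in [("stock", stock), ("undervolted", undervolted),
                ("underclocked", underclocked)]:
    print(f"{name}: {p:.0f} W")
```

In practice the two aren't independent, either: a lower clock usually tolerates a lower voltage, which is exactly why undervolting plus a mild underclock saves as much as the article shows.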
The article linked doesn't really bust any myths. It really just shows that the power management software implemented by the major vendors works as advertised: nVidia's starts low and ramps up until it hits a limit; AMD's starts high and ramps down if it hits a limit.
All these electricians talking about digital @.@ hehe.
Both voltage and frequency are relevant to power consumption. On the subject of efficiency and "stable" operation, voltage takes the cake a bit more. There are dead zones in TTL and CMOS logic. CMOS is designed to operate at lower voltages (for most common devices, 3.3V is considered high, less than 1V is considered low, and anything in between is not valid), while TTL operates at 5V; but that's irrelevant here, as modern computers operate purely on CMOS chips.
Undervolting should only be done if there is good error checking in the chip, and while some chips have it, graphics cards don't. Your best bet really is to neither overclock the frequency nor undercut the voltage; either will cause errors, though you won't see them.
Edit: Well, I partly take back the "you won't see them" part. Your operating system won't see them, or won't report them as what a user would think of as an error. But you may "see" these errors in the form of black screens, screen tearing, "fast-forwarding" (try overclocking a processor to an extreme level in a Tomb Raider benchmark; it's quite amusing to play the game at double or triple its normal operating speed), and in some cases intermittent operation (a blue screen of death would usually occur with this).
And you know who knows even better than the posters here? The engineers who make the cards - you know, the ones who design the environmental specs appropriate to the requirements and performance.
I don't know why, but I feel I needed to justify why I clicked the agree button...
Quite simply, two old sayings: "Just because you can, doesn't mean you should" and "If it ain't broke, don't fix it."
The article you linked says that GTX 980 is more efficient than Fury.
Actually, it says that the undervolted Fury X uses somewhat more power than a stock GTX 980, but the Fury X is a much faster card, making it on net more efficient when undervolted. Until it crashes because it was undervolted too far.
You're right that if the same undervolting trick works as effectively on the Fury X, it's likely more efficient than the GTX 980.
But talking about the Radeon Fury: the article has an "Efficiency Per Watt Per FPS" chart where the GTX 980 beats the Fury in 7 out of 9 games, and the article says "AMD's GPU can be almost as efficient as Nvidia's when the company isn't flogging it".
I think that counts as the GTX 980 being more efficient - or an argument could be made that they're about as efficient when the Radeon Fury is undervolted - but I don't see how that article could be used to claim that the Fury is more efficient than the GTX 980. Am I missing something from the article?
See, I was right.
He still did not read the article (besides that one-liner that fits his AMD bias) and just openly makes baseless claims that are in direct disagreement with the data and conclusions presented in the article. (See Vrika's post or Ridelynn's part about PowerTune above.)
Yes, his posts are nicely written and all, but they are utterly flawed and wrong - pure theorycrafting with no support in the data...
The rule of thumb is to not listen to guys who can't back up their claims.
I also don't listen to people who favor only one side of the market, if that helps.
If I'm looking for a new card and I didn't say I want an nvidia card, I know Quizz will list all the cards in my budget regardless of brand. When people are looking to build new systems, this childish debate between AMD and nvidia is just off topic and a waste of time for the thread's creator and readers.
Unless they specifically say they want a card from company X, coming out of nowhere in the thread saying "X sucks, go Y" is just trolling. I also never said he's always right; no one is. I said he posts a lot of information on what he's talking about (and the majority of the time he's right, though).
Instead of looking at tens of thousands of real-world tests, we'll flog the same ol' formula as proof of something.
Frequency matters little to none. Get that through your heads already.
AND I would like those who write stuff about PowerTune and Boost to stop writing if they don't have experience with BOTH, because they DO NOT work the same way and cannot be compared.
Gdemani here (and a few others) definitely doesn't have experience with both. Or with anything that matters at all.
"Underclocking/undervolting/overclocking/overvolting will make your PC explode." No it won't. Today you can't even burn up a CPU/GPU any more, because it will shut down if you go to extremes. And if your PC is not working correctly, you're doing something wrong.
And yes, I speak with experience from all camps, be it on the CPU or GPU side, and 15+ years of experience doing it, from the good ol' pentium/celeron/duron/athlon days all the way to physical mods.
If you're a dummy, you'll pay more for less. That's a fact. You can educate yourself and profit, but that's not the only perk: your PC will run faster, be more stable, use less power, and run cooler than the dummy's next door, all of which translates to longer life for your hardware.
And you know who knows even better than the posters here? The engineers who make the cards - you know, the ones who design the environmental specs appropriate to the requirements and performance.
The same engineers who make chips that calculate wrong?
Just looking at what crap those same engineers did over the years will make you shit your pants. They are not all-knowing gods walking the earth, I have to disappoint you.
If you can make your PC use less power and run EXACTLY THE SAME as before, give me one good reason why you should not.
I also don't listen to people who favor only one side of the market, if that helps.
If I'm looking for a new card and I didn't say I want an nvidia card, I know Quizz will list all the cards in my budget regardless of brand. When people are looking to build new systems, this childish debate between AMD and nvidia is just off topic and a waste of time for the thread's creator and readers.
Unless they specifically say they want a card from company X, coming out of nowhere in the thread saying "X sucks, go Y" is just trolling. I also never said he's always right; no one is. I said he posts a lot of information on what he's talking about (and the majority of the time he's right, though).
In that case you should not listen to the OP.
It is not that other people or I are way too pro Intel/Nvidia; the point is, AMD isn't currently competitive in a vast range of their products - they recycle old technology for too long and it can't keep up any more. Thus unless you sport an AMD bias, there aren't many reasons to recommend their products - NVidia can mostly do what AMD can, just better.
That is the thing: he is not right most of the time. Most of the time he is off - trapped in his bubble of theorycrafting that nothing from the real world can pierce through... The problem is, unless you are savvy enough, you won't be able to tell.
And you know who knows even better than the posters here? The engineers who make the cards - you know, the ones who design the environmental specs appropriate to the requirements and performance.
Engineers leave quite large safety margins in their chip specs, both in voltage and in frequency. See my previous example of a rock-stable overclocked 4790K that was also undervolted. Engineers provide 100% safe specs; that doesn't mean that deviating a bit from those specs for one reason or another will make the chip suddenly stop working.
100%? It's 120+%. That's what some people refuse to understand. And that "+" can go to very high values.
The more time a company spends testing, the more accurate the data and the more "optimized" the HW. That's why Intel and NVidia have very low safety margins, but on AMD (on CPUs and GPUs) you can do wild undervolting/OCing (whichever path you take).
All this complicated crap, boring. Volt both cards so they perform the SAME. Then measure the temps of the GPUs. The one that gets hotter is less efficient. Simple.
In that case you should not listen to the OP.
It is not that other people or I are way too pro Intel/Nvidia; the point is, AMD isn't currently competitive in a vast range of their products - they recycle old technology for too long and it can't keep up any more. Thus unless you sport an AMD bias, there aren't many reasons to recommend their products - NVidia can mostly do what AMD can, just better.
That is the thing: he is not right most of the time. Most of the time he is off - trapped in his bubble of theorycrafting that nothing from the real world can pierce through... The problem is, unless you are savvy enough, you won't be able to tell.
Again, proof is right in this thread.
You see this nonsense? So NVidia, which now has to use "software emulated DX12 features", is now somehow "more advanced"?
rofl, the only one who fanboys one side is you.
It's not AMD's fault that 4-year-old GPUs perform better and have actual DX12 support as opposed to NVidia's "new tech". The only thing NVidia has is a bit more efficiency. And if you actually knew how they managed to do that, you wouldn't be talking about "new tech", because you sound completely stupid: AMD's architecture is STILL more advanced.
But then, it's you, so what else to expect.
There's also one juicy thing about NVidia GPUs: the new line won't use anything new except a smaller manufacturing process over Maxwell, which will still leave it gimped for DX12.
All this complicated crap, boring. Volt both cards so they perform the SAME. Then measure the temps of the GPUs. The one that gets hotter is less efficient. Simple.
Only one thing you said is true: that it's simple. The rest is false.
It is very simple nowadays. It's not like people are pioneering stuff these days as they were 20 years ago (on one hand, manufacturers made sure you can't do any more "exotic stuff" like physical mods; OTOH, there are tools that make everything you want to do very simple). The silly thing is there are still people who refuse to face the facts.
There are still slip-ups, though. Like unlocking a Fury to a Fury X, and "SkyOC". Makes me wonder which dummy would go out and buy a "K" CPU (judging by sales, not many, which is a bit encouraging). OTOH it has been Skylake's only redeeming quality that every CPU is a "K" CPU.
Do some reading: less efficient electronic "things" dump energy into heat.
Yeah, maybe I agree to the point where you should measure the GPU temps, then over/underclock till they are at the same temp and then look at the performance.
But then again, we already know Nvidia is more power efficient. Not that I give a shit about it anyway.
Call me when you have done years of running GPUs from AMD and Nvidia 24/7. And I am not talking about playing some shitty MMO for a few hours; I am talking about running them 24/7 at the max they can go, using OpenCL and floating point and double-precision floating point.
Here's some Quad SLI on my SR-2 with two Xeon 5670s (Silverstone 1500W PSU).
Do some reading: less efficient electronic "things" dump energy into heat.
Yeah, maybe I agree to the point where you should measure the GPU temps, then over/underclock till they are at the same temp and then look at the performance.
AND you should have exactly the same cooling.
Much easier to just measure power consumption and compare it to performance.
In the end, the end user is only interested in the whole product and its performance.
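As a sketch of what "measure power consumption and compare it to performance" means in practice, here's the usual performance-per-watt arithmetic. The card names, FPS figures, and wattages below are placeholders, not measurements from the article:

```python
# Performance per watt = average FPS / board power draw (W)
def perf_per_watt(avg_fps, watts):
    return avg_fps / watts

# Placeholder numbers, not real measurements
readings = {
    "card A, stock":       (60.0, 165.0),
    "card B, stock":       (70.0, 275.0),
    "card B, undervolted": (68.0, 210.0),
}

for name, (fps, watts) in readings.items():
    print(f"{name}: {perf_per_watt(fps, watts):.3f} FPS/W")
# card A, stock:       0.364 FPS/W
# card B, stock:       0.255 FPS/W
# card B, undervolted: 0.324 FPS/W
```

This also sidesteps the cooling argument: at steady state essentially all the power a card draws leaves as heat, so power draw at a given performance level already tells you which card is more efficient, regardless of how good its cooler is.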
Do some reading: less efficient electronic "things" dump energy into heat.
Yeah, maybe I agree to the point where you should measure the GPU temps, then over/underclock till they are at the same temp and then look at the performance.
AND you should have exactly the same cooling.
Much easier to just measure power consumption and compare it to performance.
In the end, the end user is only interested in the whole product and its performance.
Got no clue what you're talking about; what does cooling have to do with power efficiency?
#1: Tom's is being given the hardware, and most likely paid to advertise the product. #2: That product was specifically made, or so the manufacturer says, to take advantage of undervolting.
Thing is this: manufacturers of ALL products have been looking for some gimmick or marketing ploy to make THEIR product more relevant than the other guy's. I spent a lot of years in the past believing too much of what I read. Then over time I realized just how corrupt marketing is in general. Point being, Tom's was given that hardware for a very specific reason and told what that reason was, so of course Tom's was going to promote it as such. The best part of all this: 99% of people who buy this stuff are not going to be fiddling with ANYTHING on their GPUs; they insert it, turn it on, then play games, and that is it.
Never forget 3 mile Island and never trust a government official or company spokesman.
Do some reading: less efficient electronic "things" dump energy into heat.
Yeah, maybe I agree to the point where you should measure the GPU temps, then over/underclock till they are at the same temp and then look at the performance.
But then again, we already know Nvidia is more power efficient. Not that I give a shit about it anyway.
Call me when you have done years of running GPUs from AMD and Nvidia 24/7. And I am not talking about playing some shitty MMO for a few hours; I am talking about running them 24/7 at the max they can go, using OpenCL and floating point and double-precision floating point.
Here's some Quad SLI on my SR-2 with two Xeon 5670s (Silverstone 1500W PSU).
Case (specially for DMKano :P ).
Yeah, because Maxwell-based Teslas and Quadros are a blast. Right.
Do some reading: less efficient electronic "things" dump energy into heat.
Yeah, maybe I agree to the point where you should measure the GPU temps, then over/underclock till they are at the same temp and then look at the performance.
AND you should have exactly the same cooling.
Much easier to just measure power consumption and compare it to performance.
In the end, the end user is only interested in the whole product and its performance.
Got no clue what you're talking about; what does cooling have to do with power efficiency?
He's saying that the stock nvidia cooler is better than the AMD stock cooler, as in the nvidia one manages to blow more heat off the die, so if you were to measure heat alone, even if nvidia were consuming more, you wouldn't feel it because of the better cooler. AMD reference coolers are really loud and kind of bad. And I admit this as the owner of a reference overclocked R9 290X. This shit throws fire even in Minecraft. A few years ago I had Sapphire 6870s in crossfire and I never had heat issues, even though they were consuming a lot more power.
#1: Tom's is being given the hardware, and most likely paid to advertise the product. #2: That product was specifically made, or so the manufacturer says, to take advantage of undervolting.
Thing is this: manufacturers of ALL products have been looking for some gimmick or marketing ploy to make THEIR product more relevant than the other guy's. I spent a lot of years in the past believing too much of what I read. Then over time I realized just how corrupt marketing is in general. Point being, Tom's was given that hardware for a very specific reason and told what that reason was, so of course Tom's was going to promote it as such. The best part of all this: 99% of people who buy this stuff are not going to be fiddling with ANYTHING on their GPUs; they insert it, turn it on, then play games, and that is it.
You mean just as they were given that 980 and paid to advertise it some time ago?
And just looking at history, Intel and NVidia have many more stains on their record than ATI/AMD. Many, many more.
What would you say about NVidia sending devs cheat sheets with instructions not to use the most important DX12 features because their HW sucks at them (and will continue to suck for another generation)?
And your notion that people should be ignorant fools is....
I buy a gaming laptop, I turn it on, I change nothing. I play 6 clients of EVE at the highest resolution with no lag or issue.
My eyes glaze over at this conversation.
My thanks to Quizzical for recommending Sager USA and nvidia for the graphics card when I bought it last year.
Soon I'll go into my driveway, start my totally stock Nissan Frontier, and drive to work......
I enjoy my simple life, it seems less "angry" this way.
Talking about power efficiency while doing absolutely nothing about it, or in most cases doing a lot of stuff (not just PC stuff) that's totally the opposite, makes you what, exactly?
This isn't about laptops (although you could benefit a bit on a laptop too; you DO want your battery to last longer, right?).
The fact is that this whole "power efficiency issue" on desktops is 99% PR crap. AMD GPUs on the whole use a bit more power, but are a bit faster. And cost the same.
Is this one of those games? Testing and trolling power consumption?
"This may hurt a little, but it's something you'll get used to. Relax....."
"The Sapphire Radeon R9 Fury X Nitro's power consumption plummets from 279W all the way down to a much more moderate 213W."
Upon further review, it looks like they're talking about a Fury non-X, and my mistake was grabbing the card name from that sentence of the review rather than a different one. Regardless, the non-X Radeon R9 Fury is also a much faster card than a GeForce GTX 980.
Furthermore, the argument was purely about performance per watt, not performance per dollar. But if you want to talk about prices, the latest New Egg prices including shipping and before rebate are $518 on a Radeon R9 Fury, $468 on a GeForce GTX 980, and $635 on a Radeon R9 Fury X. As compared to a GeForce GTX 980, the Radeon R9 Fury costs about 11% more and the Fury X about 36% more. I wouldn't think of 36% as "almost" 50%, but then, you think of two CPU cores as almost six and one memory channel as almost two, so at least you're consistent there.
You have to deal with the effects of PowerTune (AMD's power management) and Turbo Boost (nVidia's power management). Neither of which will let you directly control voltage. They could have just as easily played with the Target Power or Target Temperature settings and got similar results, because they, along with a bunch of other factors, go into determining how the chip actually runs.
P = I*E, that's derived from Ohm's law. I don't think anyone disputes that, at least for DC circuits. (Makes a nice acronym to help you remember, particularly for those in elementary school physics).
I don't know why people dispute frequency as a function of power... it blows my mind. The CMOS power frequency equation (which, if Ohm's law is elementary school, CMOS power would be middle school level I guess) clearly shows the effect.
P = F*A*C*(E^2) (hit the link to see what the variables are and how it's relevant)
(Also makes a nice acronym to help you remember). It's a variation of the AC power equation, which makes sense, since an DC stepping frequency is not dissimilar to an AC signal.
The article linked, doesn't really bust any myths. It really just shows that the power management software implemented by major vendors works as advertised. nVidia's starts low and ramps up until you hit a limit; AMD's starts high and ramps down if it hits a limit.
Both voltage and frequency are relevant to power consumption. On the subject of efficiency and "stable" operation voltage takes the cake a bit more. There are dead zones in TTL and CMOS logic, CMOS is designed to operate at lower voltages (3.3 for most common devices is considered high, less than 1v is considered low, anything inbetween is not valid.) and TTL operating at 5v respectively, but irrelevant here as modern computers operate purely on CMOS chips.
Undervolting should only be done if there is good error checking in the chip, which there is but there isn't in the case of graphics cards, your best bet really is to neither overclock the frequency or undercut the voltages, this will cause errors, though you won't see them.
Edit: Well, I kind of take back the "you won't see them" part, your operating system won't see them or report them as what a user may think of as an error... But, you may "see" these errors in the form of black screens, screen tearing, "fast-forwarding" (in this example try overclocking a processor to an extreme level in a Tomb Raider benchmark, quite amusing to play the game at double and triple it's normal operating speed), and in some cases intermittent operation (blue screen of death usually would occur on this.)
Quite simply, an old saying: "Just because you can, doesn't mean you should." and "If it ain't broke, don't fix it."
But talking about Radeon Fury, the article has "Efficiency Per Watt Per FPS" -chart, where GTX 980 beats Fury in 7 out of 9 games, and the article says "AMD's GPU can be almost as efficient as Nvidia's when the company isn't flogging it".
I think that counts as GTX 980 being more efficient, or argument could also be made that they're about as efficient when Radon Fury is undervolted, but I don't see how that article could be used to claim that Fury is more efficient than GTX 980.
Am I missing something from the article?
See, I was right.
He still did not read the article(besides that 1 liner that fits his AMD bias) and just openly make baseless claims that are in direct disagreement with data and conclusions presented in the article.
(see Vrika's post or Ridelynn's part about PowerTune above)
Yes, his posts are nicely written and all but they are utterly flawed and wrong, pure theorycrafting with no support in data...
The rule of thumb is to not listen to guys that can't back their claims.
If I'm looking for a new card and I didn't say I want an nvidia card, I know Quizz will list all the cards in my budget regardless of brand. When people are looking to build new systems, this childish debate between amd and nvidia is just off topic and a waste of time for the creator of the thread and readers.
Unless they specifically say they want from x company, coming out of nowhere in the thread saying x sucks, go y is just trolling. I also never said he's always right, no one is. I said he posts a lot of information on what he's talking about (The majority of times he's right tough).
Instead looking at tens of thousands of real world test well flogg same ol' formula as proof of something.
Frequency mattres little to none. Get that through your heads already.
AND i would like those that write stuff about power tune and boost stop writing if they dont have experience with BOTH. Because they DO NOT work same way and cannot be compared.
Gdemani here (and few others) definitely dont have experience with both. Or with anything that matters at all.
"Underclocking/undervolting/overclocking/overvolting will make your PC explode". No it wont. Today you cant even burn CPU/GPU any more because it will shut down if you go to extremes. And if your PC is not working correctly youre doing something wrong.
And yes i speak with experience from all camps be it on CPU or GPU side. And 15+ years of expereince doing it from good ol' pentium/celeren/duron/athlon days. All the way to physical mods.
If youre a dummy youll pay more for less. Thats a fact. You can educate yourself and profit, but thats not the only perk, youll PC will run faster, more stable use less power and be cooler than dummies next door. Which all translates to longer health of your hardware.
Same engineers who make chips calculate wrong?
Just looking at what crap those same engineers did over the years will make you shit your pants. They are not all knowing gods walking the earth, i have to dissapoint you.
If you can make your PC use less power and run EXACTLY SAME as before, give me 1 good reason why you should not.
It is not that people or myself are way too much pro Intel/Nvidia, the point is, AMD isn't currently competitive in vast range of their products - they recycle old technology for too long and it can't keep up any more. Thus unless you sport AMD bias, there aren't many reasons to recommend their products - NVidia can mostly do what AMD can, just better.
That is the thing, he is not right most of the time, most of the time he is off - trapped in his bubble of theorycrafting that nothing from real world can pierce through... The problem is, unless you are enough savvy, you won't be able to tell
Again, proof is right in this thread.
The more time company spends testing the more accurate data, the more "optimized" HW. Thats why Intel and NVidia have very low safety margins but on AMD (on CPUs and GPUs) you can do wild undervolting/OCing (whichever path you take)
Volt both card so they perform the SAME.
Then measure the temps of the GPU.
The one that gets hotter is less efficient.
Simple.
rofl, the only one that fanbois one side is you.
Its not AMDs fault that 4 years old GPUs perform better/have actual DX12 support opposed to NVidias "new tech". The only thing NVidia has is a bit more efficiency. And if you actually knew how they managed to do that you wouldnt be talking about "new tech" because you sound completely stupid since AMDs architecure is STILL more advanced.
But then, its you so what else to expect.
Theres also one juicy thing about NVidia GPUs is that new line of GPUs wont use anything new except smaller manufacturing process over maxwell which will still make it DX12 gimp.
It is very simple nowadays. Its not like people are pioneering stuff these days as it was 20 years ago (on one hand manufacturers made sure you cant do any more "exotic stuff" like physical mods, OTOH there are tools that make everything you want to do very simple).Silly thing is there are still people who refuse to face the facts.
Thre are still slip ups though. Like unlocking fury to fury x and "SkyOC". Makes me wonder which dummy would go out and buy "k" CPU (judjing by sales, not many which is a bit encouraging). OTOH it has been only skaylake's redeeming quality as every CPU is an "k" CPU.
Yeah maby agree to the point where you should measure the gpu temps, and the over/downlock till they are the same temp and the look at the performance.
But then again we already know Nvidia is more power efficient
Not that i give a shit about it anyways.
Call me when u have done years of running GPU's from AMD and Nvidia for 24/7.
And i am not talking about playing some shitty mmo a few hours.
I am talking about running them 24/7 to the max they can go.
Using OpenCL and Floating point and Dual precision floating point.
Here some Quad SLI on my SR2 with 2 Xeons 5670's (Silverstone 1500Wat PSU)
Case. (specialy for DMKano :P )
Much easier to just measure power consumtion and compare it to performance.
In the end, end user is only interested in whole product and its performance.
#2 That product was specifically made "so the manufacturer" says so to take advantage of under volting.
Thing is this,manufacturer's of ALL products have been looking for some gimmick or marketing ploy to make THEIR product more relevant than the other guy.I spent a lot of years in the past believing too much of what i read.Then over time i realized just how corrupt marketing is in general.
Point being Tom;'s was given that hardware for a very specific reason and told what that reason was,so of course Tom's was going to promote it as such.
The best part of all this,99% of people ho buy this stuff are not going to be fiddling with ANYTHING on their gpu's,they insert,turn it on,then play games and that is it.
Never forget 3 mile Island and never trust a government official or company spokesman.
My eyes glaze over at this conversation.
My thanks to Quizzical for recommending Sager USA and nvidia for the graphics card when I bought it last year.
Soon I'll go into my driveway, start my totally stock Nissan Frontier, and drive to work......
I enjoy my simple life, it seems less "angry" this way.
"True friends stab you in the front." | Oscar Wilde
"I need to finish" - Christian Wolff: The Accountant
Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm
Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV
Don't just play games, inhabit virtual worlds™
"This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon
http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/15
You seem to be a little behind on...things.