There is no real question that a Core i7-6700K is better than a Core i5-6600K. Just like there is no real question that a GeForce GTX 980 is better than a GeForce GTX 970. The real question is, is it better by enough to justify paying more for the better option? And the answer is, it depends on your circumstances.
I think very few would disagree with such sound logic.
As far as comparing a 5.7GHz overclock to a 4.9GHz overclock goes:
Ok. The flaw in that is assuming that because you found this sample, all other cases must follow this example.
You found one i7 that overclocks to 5.7GHz. That doesn't mean all will. It also doesn't mean that no i5s can. As a matter of fact, I've found an i5 overclocked to 6.8GHz; it's just that that guy didn't submit a 3DMark score.
Can't really determine much by looking at outliers, or by looking at isolated use cases (like benchmarks).
In short: buy an i7 if you're an elite arsehole who simply must have the best of everything (never mind the thing called cost efficiency).
Doesn't that make you a bitter person jealous of those who can afford an i7? Or the typical person who belittles others to compensate for his insecurity about his own choices? Just saying, since you turned this into a mud-slinging contest...
Just for info, here's the best score of a 6700K with a single 980 Ti compared to the best score of a 6600K with a single 980 Ti:
Ooops... the i5 gets crushed. And 3DMark is a gaming benchmark...
Ok, fair enough. But do you NEED that? You know, the human eye can't see the difference above 60fps unless you've been trained for such a thing.
I'd wait for it to cheapen a bit and the real need for it to arise. Currently, I don't see any need for it. I just don't.
And why would I be jealous of anyone? Actually I'd pity people who throw their money around like mud, but it's their money and their life, so it's no business of mine. Unless it impacts me in an indirect way (like, say, purchasing shitty games, so shitty games keep being made). Then it is my business.
In all reality, proc producers gave up on us. And why wouldn't they, when most games are programmed in such a way as to offload most of the workload onto the gfx? And every proc producer has its own gfx card brand, last I checked: Intel + nVidia and AMD + ATi. So, they focus procs on professional use (with a haphazard proc being thrown here or there towards us gamers) and the gfx arm focuses on games. Perfectly logical.
edit: I don't UNDERSTAND WHY anyone would need 5.0GHz+. It's CURRENTLY wasteful imo. Except for PROFESSIONAL use, ofc.
The track record of "no one will ever need a computer this fast" claims is quite poor. Remember "640K ought to be enough for anyone"? Or was that so long ago that you have no idea what it's talking about?
There is a story, possibly apocryphal, about someone retiring from the US Patent Office in the late 19th century and saying that he wasn't really needed anymore because everything that would ever be invented already had been.
The other issue a lot of people consistently fail to look at is that some people aren't purchasing items purely based on performance per dollar. For example, a Z06 Corvette will smoke a Porsche 911 Carrera S in just about any performance measure, whether it's lap times, 0-60, 0-100, braking, etc. It's also significantly cheaper. That doesn't mean the Porsche is a bad value.
The last proc I bought was an i7-2600K, which I got pretty much the day they came out. That was almost 4 years ago. 4 years. Now yes, while we haven't made extreme strides in CPU performance, clock for clock an i7-6700K is around 35-40% faster; that's at identical clock speeds. Factor in that 4.8GHz is generally very attainable for a new 6700K, and you're getting into the realm of something that's 50-60% better. That's enough of an upgrade for me to justify. And again, I don't want to save 150 dollars now, so that I end up having to upgrade say 2.5 years from now instead of maybe 3.5 years from now.
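As a quick sanity check on that math: the ~35-40% clock-for-clock figure and the 4.8GHz overclock are the numbers from the post above, while the 4.3GHz Sandy Bridge overclock is my own assumption for a typical 2600K:

```python
# Back-of-the-envelope check of the "50-60% better" claim.
# Assumptions: ~37% IPC gain clock-for-clock (midpoint of the
# quoted 35-40% range), a 6700K overclocked to 4.8GHz, and a
# 2600K running at an assumed typical ~4.3GHz overclock.
ipc_gain = 1.37
clk_6700k = 4.8   # GHz, quoted as "very attainable"
clk_2600k = 4.3   # GHz, assumed, not from the post
speedup = ipc_gain * (clk_6700k / clk_2600k)
print(f"estimated overall speedup: {speedup:.2f}x")  # ~1.53x
```

So the 50-60% figure holds up under those assumptions; a 2600K pushed to 4.8GHz itself would shrink the gap to roughly the IPC difference alone.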
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
nVidia is unofficially affiliated with Intel. Everyone knows that they made this agreement the moment AMD purchased ATi. So, you are correct, because I didn't phrase myself as I should've.
Also, I didn't say that the 6700K is going to be useless (in fact I think that the i5-6600K is a phenomenal investment). I said that it currently IS OVERKILL unless you use it for pro uses. Bad phrasing on my part, again.
Really.
This is why nVidia is so interested in making their own CPUs, and why Intel is putting so much effort into their GPU technology.
Those two companies pretty much despise each other as much as two companies can without declaring war.
I sometimes get the impression that Intel hates Nvidia more than they hate AMD.
Remember all those lawsuits that Intel and Nvidia threw at each other several years back over whether Nvidia could legally make chipsets for Intel CPUs? Sounds like something that allied companies do all the time, right?
Seriously, I was holding out waiting for the 20-30% premium to drop, but they're still retailing 15-20% above MSRP. Anyone know if it's still just a supply/demand issue, or if it's just retailers figuring "hell, they're still selling at this price, so might as well make the extra money"?
It has been a supply/demand issue so far. I built a 6700K machine with a 980 Ti for video and had to wait quite a bit when they launched (an Asus Hero mobo, which is a great board). The other machine in the house is Ivy Bridge with an AMD 290 card for video. Is the new build faster? Yes. Does it perform noticeably faster? Not really. Sure, you can measure it, and it benches higher, but in actual practice it isn't all that noticeable.
Now, the video is a big boost for me when driving two monitors, but at 1920x1080 either machine screams.
DDR4 RAM is nice, but it isn't cheap.
The new chip uses less voltage and runs cooler but is also less overclockable, i.e. heat isn't the limit, voltage is. It isn't able to achieve the levels we thought it would (but doesn't have to either), so here's my take:
If you are doing a new build absolutely go with the latest. If running Haswell or Ivy Bridge, I don't see a compelling reason to upgrade.
We are at that stage in computer evolution where boosts are incremental at best. Gone are the days when one jumped from a 386 into a Pentium and went.. OMG!
I honestly don't think anyone is really out to crush AMD any longer.
They seem to have crushed themselves fairly well.
It's to the point that Intel can actually release slower chips, and publicly announce they are doing so, and people still say "How great" - because they don't even consider there to be any competition. So Intel is pushing hard to get IoT-style processors and low power "bigger-than-mobile" devices, like MS Surface tablets, hybrids, and AIO units.
And nVidia has branched out into many more markets than PC discrete graphics. Even if AMD takes a technological edge in PC video cards, that still doesn't touch nVidia's supercomputing and HPC market share, or its commercial market share, and they are pushing into the automotive and mobile markets now, which are still pretty much wide-open fields, particularly for customers that want or require media-rich applications in a low power envelope.
AMD doesn't really compete in any of those areas; their bread and butter is APUs that go into low-cost, low-margin machines, and low-power server cores, and that is what they are doubling down on going forward.
Looking at SEC annual reports, nVidia pulled in almost as much from Tegra Automotive alone as AMD did with its entire x86 and GPU revenue combined in 2014. nVidia managed to pull off 7% revenue growth against a 10% declining PC market; AMD fell by 16% (so faster than the PC market did).
AMD isn't doing well financially, and hasn't been for a while. However, I wouldn't take that as a non-recommendation of their products. The old Avis marketing line comes to mind: "We're #2, so we try harder"
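To put those growth figures on the same footing, here's a rough comparison against the shrinking PC market; the percentages are the ones summarized from the SEC reports above, not exact revenue numbers:

```python
# How each company fared relative to a PC market that shrank ~10%.
# Figures are the approximate 2014 numbers quoted in the thread.
pc_market_change = -0.10   # overall PC market, ~-10%
nvidia_change    = 0.07    # nVidia revenue, ~+7%
amd_change       = -0.16   # AMD revenue, ~-16%

# Growth relative to the market baseline:
nvidia_vs_market = (1 + nvidia_change) / (1 + pc_market_change) - 1
amd_vs_market    = (1 + amd_change) / (1 + pc_market_change) - 1
print(f"nVidia vs market: {nvidia_vs_market:+.1%}")  # about +18.9%
print(f"AMD vs market:    {amd_vs_market:+.1%}")     # about -6.7%
```

In other words, nVidia outgrew its shrinking market by nearly 19 points, while AMD shrank faster than the market it was selling into.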