http://anandtech.com/show/10183/intels-tick-tock-seemingly-dead-becomes-process-architecture-optimization

Intel's Tick-Tock philosophy is dead. What this means for gamers is nothing new. The Intel 2500K is already legendary, but it will be even more so now.
We've hit a plateau, which means that if you have a good processor right now, nothing better will come out in the foreseeable future. And when a new architecture does come out, the gains will mostly be in power consumption and thermals. Those matter more for laptops, which are limited by space and power capacity.
For gamers on desktop this effectively means that the only thing worth upgrading now is your GPU. If you're planning to go out and build a new desktop, hold your horses. You might only see gains from upgrading the GPU (Pascal within 6-8 months). Maybe get a new case, a good SSD, a new power supply or a really nice monitor.
What will this mean? For one, the majority of apps are still dual-core. Quad-core utilization still hasn't gone mainstream across most apps on PC. That means games will most likely continue to be optimized for the hardware that's already out there.
In five years' time, the 4770K still looks like it will be near the top of the heap. With DDR4 also giving negligible gains, it seems it just got a lot more affordable to be a PC gamer.
It's possible we will see a trend towards quad-cores. The Steam hardware survey says that 48% of all gamers are still on dual-core machines:

http://store.steampowered.com/hwsurvey

When you hear console gamers say that buying and upgrading a PC is too much of a hassle, what do you think when you see things like this? Do you think this Intel stoppage is good? Is it good that the power-creep advancement takes a back seat? How will this impact pricing? Does this mean that GPUs are becoming a lot more important than CPUs? Does it mean that developers will put more effort into using the CPUs we have?
MMOs have always been notorious for being some of the most CPU-heavy games. The CPU helps render the large landmass and calculates the many objects in the game world. It handles many of the underlying systems, while the GPU renders everything with nice bump mapping, shaders, shadows, and other effects.

We do have games that love a nice beefy CPU. The modding communities for many games rely on it. GTA V loves CPU power, as does BF4.
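As a rough illustration of that split (a toy sketch of my own, not code from any actual engine; the Entity/World names are made up), the CPU side of a frame looks something like this:

```cpp
#include <cstdio>
#include <vector>

// Hypothetical illustration only: Entity and World are made-up names,
// not taken from any real engine.
struct Entity {
    float x = 0.0f, y = 0.0f;   // position in the game world
    float vx = 0.0f, vy = 0.0f; // velocity
};

struct World {
    std::vector<Entity> entities;

    // CPU-side simulation: every object needs updating each frame, which is
    // why crowded MMO hubs hammer the CPU long before the GPU.
    void updateSimulation(float dt) {
        for (Entity& e : entities) {
            e.x += e.vx * dt;
            e.y += e.vy * dt;
            // a real game would also run AI, pathing, collision, networking...
        }
    }

    // After simulation, the CPU still has to build and submit draw calls;
    // the GPU only takes over once those commands are issued.
    void submitDrawCalls() const {
        std::printf("submitting %zu draw calls\n", entities.size());
    }
};

int main() {
    World world;
    world.entities.resize(5000); // a busy city zone
    const float dt = 1.0f / 60.0f;
    for (int frame = 0; frame < 3; ++frame) {
        world.updateSimulation(dt);
        world.submitDrawCalls();
    }
}
```

None of that per-object work touches the GPU; pile on a few thousand players and NPCs and the CPU becomes the bottleneck long before the graphics card does.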
Comments
I would recommend looking up what "Moore's law" is before you use it.
One could think that you don't know what you're talking about...
And I ordered 8.9 billion transistors (AMD Nano); it is so CUTE!
(OMFG, did I really call a video card cute?!... Ooh... it IS.)
I always understood the tick-tock philosophy as a pass-through of Moore's law and its prediction of exponential growth (at least in the context of how Intel has adapted it). Seeing it come to an end made it seem logical (to my bird brain) to conclude that Moore's law is no more for Intel, because they are walking away from the principle of exponential growth at its heart.
Could you explain why Intel moving away from tick-tock is not a break of Moore's law?
Even the Xbox One's main CPU (5 billion) has more transistors than an Intel 6400K.
Even the 18-core Haswell Xeon E5 has 5.5 billion. :P
My new video card has 8.6 billion. :P
A GTX 980 Ti has around 8 billion.
This one has 10 billion transistors.
But this one ain't as cute as my AMD Nano. :P
My first PC had 29,000 transistors. :P (4.77 MHz IBM PC/XT 8088)
https://en.wikipedia.org/wiki/Transistor_count
Don't want to call it Moore's law? Then call it something else.
But the freebies of miniaturization were over once the benefits of FinFET were exhausted. That happened about two years ago for Intel; it will happen about two years from now for GPUs too.
The party wasn't going to last forever, and everyone knew this. Not only is further miniaturization extremely difficult and costly (see the delays of EUV), but many consumers are happy with the speeds they have. Once consumers could play HD video and browse with 40 tabs open, most had everything they needed, and that happened several years ago.
There is still money to be made in mobile, ironically because another technology, the battery, has run out of steam. But even there the growth is limited, because the screen consumes most of the battery power, not the processor.
"True friends stab you in the front." | Oscar Wilde
"I need to finish" - Christian Wolff: The Accountant
Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm
Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV
Don't just play games, inhabit virtual worlds™
"This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon
It's also a RISC processor made for servers; it's going to be good at very specific server tasks, and it is not going to be a good processor for general-purpose computing.
It's like comparing a tractor to a car: the car might have ten times as much horsepower as the tractor, but the tractor has a different purpose and is much better than the car at some specific things.
Intel simply hasn't had any stiff competition from AMD over the last four years, so there was no reason to push the bounds of current CPU tech. Remember, Intel is in the business of making money, and there is no money in simply competing with yourself. Hopefully, with the release of Zen this October, competition will return to the CPU industry and we will again have a reason to get excited over new releases.
As to the comments on games (or apps in general) only using two cores, this isn't necessarily true either. I've played several games lately that, while they may still favor one core over the others, spread a significant portion of the work across all six cores on my rig. It may still be a few years before we see quad-core as a minimum requirement, but extra cores are already being utilized by many of the more recent titles. Beyond the games themselves, simply having the additional cores to spread out other tasks that may be running in the background, things like web browsers, chat programs and video streaming, is already very beneficial.
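To make that concrete, here is a rough sketch of how an engine can fan per-frame work out across however many cores it finds, which is why a six-core chip still gets used even if the game was tuned with fewer cores in mind. This is my own toy example, not code from any actual title; updateChunk is a made-up name:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// Stand-in for per-object work (AI, physics, particles...), not a real engine API.
void updateChunk(std::vector<float>& data, std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i) {
        data[i] += 1.0f; // placeholder for real per-object work
    }
}

int main() {
    std::vector<float> objects(100000, 0.0f);

    // Use however many hardware threads are available: 2, 4, 6, ...
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = objects.size() / cores;

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end = (c + 1 == cores) ? objects.size() : begin + chunk;
        workers.emplace_back(updateChunk, std::ref(objects), begin, end);
    }
    for (std::thread& t : workers) t.join();

    std::printf("updated %zu objects on %u cores\n", objects.size(), cores);
}
```

A real engine would keep a persistent thread pool rather than spawning threads for every batch, but the idea is the same: the work scales to whatever core count the machine happens to have.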
On to the subject of GPUs, I agree that GPUs are often the better upgrade when gaming improvements are the key factor, but even GPUs seem to have slowed in development recently. Both AMD and nVidia have had several re-badges and even a year without a new series release over the past five years. Don't get me wrong, they have still been pushing the top end, but mid-to-high-end cards from six years ago are still able to push playable frame rates in most titles today. Case in point: I run a pair of Radeon HD 6870s in my system (rel. date: Oct 2010), and in most games I still get 30+ fps with all settings on high, while less demanding titles average 45+ on max settings. Again, this is all set to change over the next year, as I believe both AMD and nVidia are set to release all-new series (read: no re-badges) this year with the promise of significant gains over the existing stock.
This is all a double-edged sword, so to speak, for consumers. It's great that our hardware investments are seeing longer life cycles, but this also results in slower depreciation, so people on a tight budget may have to wait longer to afford that upgrade they've had their eye on. As far as software development goes, I don't see this having any impact in the short term. As you accurately pointed out, the hardware is in most cases still capable of more than is being demanded of it, so software development will proceed as usual. Now, if this upcoming generation does not realize any significant gains, then we may begin to see a push for more optimized coding, but that would still be 2-3 years from now.
1. It is taking longer to complete a "tick-tock" cycle.
2. You have to sell enough of whatever comes out of the pipeline to justify the expense of research, building new fabrication plants and so forth.
They are keeping the tick (process), keeping the tock (architecture) but now have added a toe (optimisation).
And that graph: 180 nm in 2001 compared to 14 nm today. Written another way: 140 Å (angstroms).
Which may take a while.
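For scale, spelling out the arithmetic (my own back-of-the-envelope numbers, not figures from the article):

```latex
% 1 nm = 10 angstroms, so 14 nm is 140 angstroms. (\text requires amsmath.)
\[
  14\ \mathrm{nm} = 140\ \text{\AA},
  \qquad
  \frac{180\ \mathrm{nm}}{14\ \mathrm{nm}} \approx 12.9\ \text{(linear shrink)},
  \qquad
  \left(\frac{180}{14}\right)^{2} \approx 165\ \text{(area density, in principle)}
\]
```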
Some argue "Moore's law has slowed down", which is just another way of saying Moore's law is dead.
I don't think it's down to competition; silicon has hit the physical limits of what is possible. Beyond 5 nm you get so much leakage that it is impossible to continue, and that limit is hard-coded by nature and has to do with the size of molecules. The limitations of silicon are not something you overcome unless you can walk on water.
Lots of competition; lots of growth potential. The growth areas for Intel, though, are phones, tablets, wearables, home appliances, etc. Whether we need, e.g., fridges with LED doors is another question, but TVs and watches/Fitbits are coming.
And companies like ARM, NVidia, Qualcomm and many others operate in these markets.
And Intel's cutting "edge" is size; as Kiyorsis said above:
"It's 10 billion transistors because the size of the chip is physically large, not because they increased the amount of transistors on a similarly sized integrated circuit"
And the flip side is true: Intel can put more chips in a smaller space. So if <<insert non PC product here>> needs X transistors, Intel will be able to supply it in a smaller package. Whether they can do so at a low enough cost, and how far they will push, is another matter; they are pushing, however - hence low-power-consumption solutions, etc.
Other alternatives are spintronics and quantum computing, but they can only be used in very limited ways; they are not good for general-purpose computing, especially quantum computing.
The next leap in computing power will be ushered in by a materials advance or other change to the technological conditions.
Could be a lot of things.