Intel: CPUs will become slower but more energy efficient; Moore's law is over.

Kiyoris Member RarePosts: 2,130
edited February 2016 in Hardware
https://thestack.com/iot/2016/02/05/intel-william-holt-moores-law-slower-energy-efficient-chips/



Intel has said that new technologies in chip manufacturing will favour better energy consumption over faster execution times – effectively calling an end to ‘Moore’s Law’, which successfully predicted the doubling of density in integrated circuits, and therefore speed, every two years.


Intel: “We’re going to see major transitions. The new technology will be fundamentally different. The best pure technology improvements we can make will bring improvements in power consumption but will reduce speed.”


Everyone who follows this tech a bit saw this coming, though: the easy gains from FinFET are used up, EUV is years behind schedule, and ASML's stock took a massive hit.

The investment needed for more speed is way too high, and with PC sales tanking, there is simply no money to keep Moore's law alive.

There is still a lot of money to be found in mobile; it's a very profitable market, and Intel's focus is shifting there. Also, the "internet of things" Intel mentions, basically putting chips in everything, is a growing market.
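For reference, the "doubling every two years" from the article compounds very quickly. A rough back-of-the-envelope sketch (Python), starting from a 1.4 billion-transistor desktop chip in 2012 (roughly an Ivy Bridge quad core); purely illustrative, not any vendor's roadmap:

    # Rough Moore's-law projection: transistor count doubling every two years.
    # Starting point is illustrative (~1.4B transistors in 2012), not a roadmap.
    start_year, start_count = 2012, 1.4e9

    for year in range(2012, 2027, 2):
        doublings = (year - start_year) / 2
        count = start_count * 2 ** doublings
        print(f"{year}: ~{count / 1e9:.1f} billion transistors")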

Comments

  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    Also, look at ASML's stock.

    Stock traders knew this news was coming when the costs and delays of EUV became apparent.


  • Malabooga Member UncommonPosts: 2,977
    edited February 2016
    We already had a laugh at this spin.

    "We wasted billions upon billions to try to produce something faster but managed only 5% in some specific tasks and now we have to spin it to look good"

    And all those who said Intel was just "holding off faster chips", lol. Nope. Nothing new from Intel for the foreseeable future (which everyone with at least a modicum of interest in the area knew), except "let's change the socket every year and not update previous platforms, there's enough suckers out there".
  • breadm1x Member UncommonPosts: 374
    edited February 2016
    "doubling of density in integrated circuits" "and therefore speed"

    Witch is bullshit they forgot something very important, they become more energy efficient to...

    More never spoke about speed, no clue where they got that from.
    He was talking about the number of transistors in an integrated circuit.
    Donno who wrote the article but its a complete moron.

    Here's a little tip.

    "NVIDIA Pascal GPU Feature 17 Billion Transistors, Almost Twice The Transistors of Fiji"

    Or does Moore's law only go for Intel CPUs?  (rotfl)

    Fury X has a total of 8.9 billion transistors.
    GTX Titan X comes with 8.0 billion transistors.
    (the Xbox One chip has 5.0 billion, btw)









  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    breadm1x said:

    Moore never spoke about speed
    He was talking about the number of transistors in an integrated circuit.
    Don't know who wrote the article, but they're a complete moron.

    The increase in transistors is what causes the increase in speed, pumpkin.

    Article: "effectively calling an end to 'Moore's Law', which successfully predicted the doubling of density in integrated circuits, and therefore speed"

    Nothing wrong with the article.
  • Phry Member LegendaryPosts: 11,004
    Do CPUs even need to be faster, though? As far as I can see, what we need more of now is processors on a die; the speed of the processor is probably less important now than it's ever been. Focusing more on increasing the number of processors than on raising clock speeds seems to me a far more productive direction than an arms race of increasingly diminishing returns over who can have the highest clock speed; after all, we're already at the point where a quad-core processor is the absolute minimum standard for new builds.
  • Kiyoris Member RarePosts: 2,130
    Phry said:
    Do CPUs even need to be faster, though? As far as I can see, what we need more of now is processors on a die; the speed of the processor is probably less important now than it's ever been. Focusing more on increasing the number of processors than on raising clock speeds seems to me a far more productive direction than an arms race of increasingly diminishing returns over who can have the highest clock speed; after all, we're already at the point where a quad-core processor is the absolute minimum standard for new builds.
    I think regular people stopped caring years ago. Once you could play HD video and browse the web without any lag, which happened like 6 years ago, I think most people were satisfied.

    4K isn't enough to entice those people to upgrade. I'm a gamer and I don't even care about it; I'd rather have 1080p at 4x the framerate.

    Intel has been very active in USB 3.1 and NVMe SSDs, but that won't be enough to get people to upgrade their PCs.


  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    breadm1x said:

    Here's a little tip.

    "NVIDIA Pascal GPU Feature 17 Billion Transistors, Almost Twice The Transistors of Fiji"

    Or does Moore's law only go for Intel CPUs?  (rotfl)

    GPUs are only now getting the benefits of FinFET. The next generation (like Pascal) will be the first GPUs to use it.

    But you have to understand that Intel already got those benefits: FinFET was introduced with Ivy Bridge and Haswell, taking Intel from 32nm to 22nm.

    Those benefits are already "used up" by Intel. Nanowire FETs and TFETs are more focused on energy efficiency, and nowhere near as promising as FinFET was.

    FinFET is not a new technology that just came around the corner; it has been around a long time, and DARPA was talking about multigate transistors decades ago. You're finally seeing it appear in GPUs, but Intel has long enjoyed those benefits; that increase in transistors is already "used up" by Intel.
  • Squishydew Member UncommonPosts: 1,107
    As someone who only knows about the number of cores and the number listed behind "GHz", can someone explain what exactly this means?

    Basically the GHz speed won't be going up..?

    Sorry for the ridiculously stupid question >.<
  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    As someone who only knows about the number of cores and the number listed behind "GHz", can someone explain what exactly this means?

    Basically the GHz speed won't be going up..?

    Sorry for the ridiculously stupid question >.<
    It simply means Intel sees its future more in mobile and "the internet of things".

    (the internet of things is putting a chip in everything, so your coffee maker can talk to your PC)

    One of the reasons is that it has become incredibly costly to keep making CPUs faster, and the market that supports PC CPUs has been undermined by smartphones and mobile devices (PC sales have plummeted).

    They see a future in making chips more power efficient instead of more powerful.
  • breadm1x Member UncommonPosts: 374
    edited February 2016
    So which one is faster, an Ivy Bridge or a Haswell at the same clock speed?
    (both have 1.4 billion transistors)

    "One of the reasons is that it has become incredibly costly to keep making CPUs faster, and the market that supports PC CPUs has been undermined by smartphones and mobile devices (PC sales have plummeted)."

    Ah okay, and smartphones don't have ICs? :pleased:


    The "SPARC M7" has 10 billion transistors, b.t.w.
    http://siliconangle.com/blog/2015/10/26/oracle-debuts-first-systems-with-10-billion-transistor-sparc-m7-chip/
     



  • Kyleran Member LegendaryPosts: 44,093
    edited February 2016
    Kiyoris said:
    Phry said:
    Do CPUs even need to be faster, though? As far as I can see, what we need more of now is processors on a die; the speed of the processor is probably less important now than it's ever been. Focusing more on increasing the number of processors than on raising clock speeds seems to me a far more productive direction than an arms race of increasingly diminishing returns over who can have the highest clock speed; after all, we're already at the point where a quad-core processor is the absolute minimum standard for new builds.
    I think regular people stopped caring years ago. Once you could play HD video and browse the web without any lag, which happened like 6 years ago, I think most people were satisfied.

    4K isn't enough to entice those people to upgrade. I'm a gamer and I don't even care about it; I'd rather have 1080p at 4x the framerate.

    Intel has been very active in USB 3.1 and NVMe SSDs, but that won't be enough to get people to upgrade their PCs.


    As a "regular" person I stopped caring when game developers stopped creating new titles designed for the next generation of processors and instead focused on the previous generation so everyone and their brother could play the game.

    This all took place around the time WoW and Vanguard released, and since then my gaming laptops have been upgraded mostly for better graphics processing capability, not so much for CPU power.

    On the business side, back in the 90s spreadsheets could take so long to recalculate that people put up dummy "please wait" screens to hide the fact that they were playing games at work.

    Now, no matter how large my spreadsheets get or how many I have open, recalculating takes only an instant, with the only real slowdown being the time it takes to open them from a network drive.

    No need to increase CPU speeds until a new product comes out that needs the extra power, so I can see why Intel is making this move.

    "True friends stab you in the front." | Oscar Wilde 

    "I need to finish" - Christian Wolff: The Accountant

    Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm

    Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV

    Don't just play games, inhabit virtual worlds™

    "This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon






  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    Kyleran said:
    As a "regular" person I stopped caring when game developers stopped creating new titles designed for the next generation of processors and instead focused on the previous generation so everyone and their brother could play the game.
    Probably helped by consoles migrating to x86; developers are simply making games for consoles and porting them to PC (or the other way around, whatever you want to call it).

    So there's no real reason to try to get the most out of games on PC; porting is so much cheaper.

    Those PC vs PS4 vs Xbox One comparison videos on YouTube are a bit depressing to look at sometimes; often the game looks exactly the same on all three platforms (with some ridiculously minor differences). There's the FPS difference, but graphically they look the same.
  • Phry Member LegendaryPosts: 11,004
    Kiyoris said:
    As someone who only knows about the number of cores and the number listed behind "GHz", can someone explain what exactly this means?

    Basically the GHz speed won't be going up..?

    Sorry for the ridiculously stupid question >.<
    It simply means Intel sees its future more in mobile and "the internet of things".

    (the internet of things is putting a chip in everything, so your coffee maker can talk to your PC)

    One of the reasons is that it has become incredibly costly to keep making CPUs faster, and the market that supports PC CPUs has been undermined by smartphones and mobile devices (PC sales have plummeted).

    They see a future in making chips more power efficient instead of more powerful.
    Which is the definitive stupid answer tbh.

    GHz just refers to how many cycles per second the chip is rated at, i.e. hertz: MHz = millions of cycles per second, GHz = billions of cycles per second.
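    To put actual numbers on it, a quick throwaway sketch (Python; the frequencies below are just arbitrary examples, nothing Intel-specific):

        # A clock rate in hertz is cycles per second, so one cycle takes
        # 1/frequency seconds. Example frequencies only.
        for label, hz in [("100 MHz", 100e6), ("1 GHz", 1e9), ("4 GHz", 4e9)]:
            print(f"{label}: {1.0 / hz * 1e9:.2f} ns per cycle")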

    Ignore Kiyoris's doom and gloom, however, as it bears no relation to actual hardware sales, only to complete unit sales, which are not the entirety of the PC market by a long shot: Intel's and AMD's CPU sales are not solely dependent on pre-built PC unit sales, and the same goes for Nvidia's and AMD's GPU sales.
  • Kiyoris Member RarePosts: 2,130
    breadm1x said:
    yes, and it's massive

    not hard to increase transistor count if your die size is the size of a small pancake
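    Transistor count by itself doesn't say much; transistors per mm² is the fairer comparison. A quick sketch with approximate, publicly quoted die sizes (ballpark figures, treat the exact mm² as assumptions):

        # Transistor count vs. die area, ballpark public figures (approximate).
        chips = {
            "Ivy Bridge 4C (22nm)": (1.4e9, 160),      # (transistors, die area in mm^2)
            "GTX Titan X / GM200 (28nm)": (8.0e9, 601),
            "Fury X / Fiji (28nm)": (8.9e9, 596),
        }
        for name, (count, area) in chips.items():
            print(f"{name}: {count / 1e9:.1f}B transistors on {area} mm^2 "
                  f"-> {count / area / 1e6:.1f}M per mm^2")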




  • breadm1x Member UncommonPosts: 374
    edited February 2016
    Intel says a lot, btw. Here's one from 2014 :pleased:
    http://www.pocket-lint.com/news/126289-intel-claims-that-by-2026-processors-will-have-as-many-transistors-as-there-are-neurons-in-a-brain

    B.T.W. that SPARC M7 CPU reminds me of the DEC Alpha RISC CPU I have somewhere.....
    Found it; it even has a motherboard on it, and it still works, even has PCI slots :pleased:
    (and that's from 1992, with 1.7 million transistors)

  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    Gorwe said:
    So, I can safely stay with my 3470 and hopefully 6600k in the future.

    Good to know :D
    Pretty much, and even if you want USB 3.1, there are tons of USB 3.1 cards already out, including Type-C cards with the reversible connector (like Apple's, whatever it's called).

    Just put a card in an open slot and you have USB 3.1; no need to change your mobo.

    (unless you filled your PCI Express slots with dual GPU cards, in which case you're out of luck lol)




  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    Gorwe said:
    ...I know I know! They did something insignificant to the central part of a die so as to force people to buy MoBos lol! ...whatever
    Yeah, which is why Intel is a big sponsor of everything that might entice you to upgrade a mobo, like... USB 3.1, NVMe.

    But... I have never really felt I needed a faster SSD or USB port. The only time I really transfer large amounts of data over USB is when I plug in an SD card reader to copy pictures from my DSLR... and that's fast enough.

    A backup over USB is something I only do like once a month; it's fast enough.
  • Phry Member LegendaryPosts: 11,004
    Gorwe said:
    So, I can safely stay with my 3470 and hopefully 6600k in the future.

    Good to know :D
    It's reasonably okay for now, but in the future you might find that a quad core is insufficient.
  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    There is conflicting evidence about quantum computing.

    For general computing, it doesn't seem to be faster; it actually seems significantly slower.

    It is only for very specific algorithms, on very specific problems, that you see a quantum speedup.


  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    The immediate future is EUV: lithography with extreme ultraviolet light.

    IMEC made the first 5nm chips: http://www.electronicsweekly.com/news/business/manufacturing/imec-and-cadence-tape-out-first-5nm-ic-2015-10/ thanks to EUV.

    But EUV got delayed, a lot, the machines are more expensive than anticipated, and not many are buying them yet, which is why you saw ASML's stock tumble like crazy.

    Also, uptime was low at first (uptime is how long a machine keeps running and putting out chips); it's better now, at around 70%.
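    To put that 70% in perspective, a toy calculation; the wafers-per-day number below is purely hypothetical (not an ASML spec), it's just to show how availability eats into output:

        # Toy example: effective output = nominal throughput x availability.
        # The 1,000 wafers/day figure is hypothetical, not an ASML spec.
        nominal_wafers_per_day = 1000
        for availability in (0.55, 0.70, 0.90):
            wafers = nominal_wafers_per_day * availability
            print(f"{availability:.0%} uptime -> ~{wafers:.0f} wafers/day")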


    ASML EUV machine:


  • Leon1e Member UncommonPosts: 791
    edited February 2016
    Oh well call me hipster but my boyfriend and I still play on FX-8350 and i5-2500k@4.5Ghz and we still max out games. (1080p@60fps)

    We don't upgrade because 1) We don't really feel the need to (0 hype about VR or 4K gaming) and 2) The value of the US dollar is so damn high, the electronics prices have almost doubled in our country.

    Can't wait for the next dollar crash. xD 

    Hopefully the upcoming AMD architecture will be able to compete with Intel so they lower their prices. $600 for a CPU is a bit over the top, given you can get a 980 Ti for that much money, or, you know, the latest iPhone with all of its camera and display gimmicks and processing and graphics power.
  • Phry Member LegendaryPosts: 11,004
    In the next years, Intel processors will still gain 5 to 15% performance each new generation, and doom mongers like the OP will have to eat their hats...
    The number of times people have proclaimed the doom or end of the PC is only equalled by the number of times it has been proven that it isn't.
  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    In the next years, Intel processors will gain 5 to 15% performance each new generation

  • Kiyoris Member RarePosts: 2,130
    edited February 2016
    Leon1e said:
    We don't upgrade because 1) We don't really feel the need to (0 hype about VR or 4K gaming)
    I don't see the benefit of 4K; you are effectively decreasing your FPS by 2/3.

    Increasing the resolution by 4x = roughly 1/3rd of the FPS.

    When we went from 800*600 to Full HD, we had the increases in hardware speed to support the move, and the increase in resolution was easily apparent. That's not the case with 4K.
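    Here's that arithmetic as a rough sketch, assuming a fully GPU-bound case where FPS scales inversely with pixel count; real games usually drop a bit less than the model says (hence the ~70% measured figure below rather than 75%):

        # Toy model: FPS scales inversely with pixel count (fully GPU-bound case).
        base_pixels, base_fps = 1920 * 1080, 60.0
        for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
            scale = base_pixels / (w * h)
            print(f"{w}x{h}: ~{base_fps * scale:.0f} fps "
                  f"({(1 - scale):.0%} drop vs 1080p)")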

    From my "4k is pointless" thread, increasing resolution by 4 drops your framerate by about 70%:


  • Malabooga Member UncommonPosts: 2,977
    edited February 2016
    Leon1e said:
    Oh well call me hipster but my boyfriend and I still play on FX-8350 and i5-2500k@4.5Ghz and we still max out games. (1080p@60fps)

    We don't upgrade because 1) We don't really feel the need to (0 hype about VR or 4K gaming) and 2) The value of the US dollar is so damn high, the electronics prices have almost doubled in our country.

    Can't wait for the next dollar crash. xD 

    Hopefully the upcoming AMD architecture will be able to compete with Intel so they lower their prices. $600 for a CPU is a bit over the top, given you can get a 980 Ti for that much money, or, you know, the latest iPhone with all of its camera and display gimmicks and processing and graphics power.
    The only thing you need for VR and 4K is a better GPU, and the CPU will be even less relevant because the workload will be several times more GPU bound.

    http://www.pcper.com/reviews/Systems/Quad-Core-Gaming-Roundup-How-Much-CPU-Do-You-Really-Need

    Just compare the different CPU results going from 1080p to 1440p, and those are already old, outdated games.

    Also, Mantle results, as a preview of DX12 and Vulkan, make CPUs even less important. Where there was a CPU bottleneck, Mantle helped a lot, bringing a $65 CPU and a $400 CPU within 30% of each other at 1080p. At 4K there would be zero difference.

    Unless you're doing some tri- or 4-way high-end GPU setup... no, you don't have to worry about the CPU at all.