Intel: CPUs will become slower but more energy efficient; Moore's law is over.


Comments

  • jesteralways Member RarePosts: 2,560
    edited February 2016
    Currently clock speed is pretty pointless on its own. There was a time when running a single CPU instruction took several clock cycles: there was no cache memory, instructions were about 4 bytes each, and RAM was measured in tens of kilobytes. The CPU had to read from RAM which instruction to execute, then look it up in the CPU ROM for a match, and only then start executing it step by step. It was a long process, and it felt longer because of how small the RAM, the CPU ROM, and the instructions were.

    For a long time CPU design followed this rule: increase the size of RAM to hold more instructions, increase the size of instructions from time to time, but above all raise the clock speed so that more instructions could be executed. As they worked on improving CPUs they eventually came to the current conclusion: execute a set of larger instructions in a limited number of clock cycles. So they invented cache memory and redesigned CPUs around how many times the instructions in cache can be cleared out. The goal now is a large cache drained in very few cycles. For example, if a CPU could clear 256MB of cached instructions per 1MHz of clock, then at 1GHz it could clear 1000 times that: 256,000MB, or roughly 256GB of instructions.

    So you tell me: if around 256GB of instructions can be cleared at 1GHz, do we really need more? And which is more financially profitable, more clock speed, or more cache and faster execution of instructions? I think Intel is going for the latter, and so is AMD with Zen.
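
    As a rough sketch of that arithmetic (the 256MB-per-MHz figure is the hypothetical from the paragraph above, not a real CPU spec):

        # Back-of-the-envelope version of the hypothetical numbers above.
        # Illustrative only; no real CPU clears cache this way.
        cache_cleared_per_mhz_mb = 256   # hypothetical: 256MB cleared per 1MHz
        clock_mhz = 1000                 # 1GHz = 1000MHz
        throughput_mb = cache_cleared_per_mhz_mb * clock_mhz
        print(throughput_mb, "MB of instructions per second")  # 256000, i.e. ~256GB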

    Boobs are LIFE, Boobs are LOVE, Boobs are JUSTICE, Boobs are mankind's HOPES and DREAMS. People who complain about boobs have lost their humanity.

  • Kyleran Member LegendaryPosts: 44,093
    Well, Moore's law may be coming to an end, but as far as I can tell Machrone's law is still alive and kicking.

    The Sager laptop I really want would cost this much, but I'll settle for one at about half that for now.

    "True friends stab you in the front." | Oscar Wilde 

    "I need to finish" - Christian Wolff: The Accountant

    Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm

    Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV

    Don't just play games, inhabit virtual worlds™

    "This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon

  • Ridelynn Member EpicPosts: 7,383
    I gotta agree with @breadm1x on this one:

    Moore's law only talks about component density. It says nothing directly about speed. It doesn't even specifically require transistors, and it acknowledges that ICs may one day be made from materials other than silicon.

    Moore's paper, from which Moore's Law was derived, is an interesting read.
    https://www.cs.utexas.edu/~fussell/courses/cs352h/papers/moore.pdf

    Moore's law isn't dead, yet.

    Just because Intel's next CPU may be slower doesn't mean it won't contain more transistors. In fact, I would bet that it will have more, because Intel is pushing competitive integrated graphics, and those graphics cores carry a good number of transistors, especially if you consider something like Crystal Well. So the CPU may be slower, but the die will probably have more transistors overall, continuing the trend of Moore's Law.
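
    As a toy illustration of that doubling trend (using the Intel 4004's roughly 2,300 transistors in 1971 as a reference point, and the every-two-years pace Moore later settled on):

        # Toy Moore's-law curve: component counts per chip double about every
        # two years (the 1965 paper said every year; Moore later revised it to two).
        def transistors(year, base_year=1971, base_count=2300):
            # Intel 4004 (1971, ~2,300 transistors) as the reference point.
            doublings = (year - base_year) / 2
            return base_count * 2 ** doublings

        for y in (1971, 1981, 1991, 2001, 2011):
            print(y, f"{transistors(y):,.0f}")
        # Prints roughly 2,300 / 73,600 / 2.4M / 75M / 2.4B -- in the right
        # ballpark for real chips of those years, which is all the law claims.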
  • Wizardry Member LegendaryPosts: 19,332
    So in other words, they still need an angle to produce sales, except this angle is a shallow one that won't encourage anything, because it ONLY matters to the consumer, and consumers will aim for value: speed versus cost.

    So the best thing Intel can do is jimmy-rig their chips to start failing faster, because nobody is going to spend 2-3k on an upgrade just to get better power consumption, so all sales will halt.

    Never forget Three Mile Island and never trust a government official or company spokesman.

  • The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • tawess Member EpicPosts: 4,227
    Phry said:
    In the next few years, Intel processors will still gain 5 to 15% performance with each new generation, and doom mongers like the OP will have to eat their hats...
    The number of times people have proclaimed the doom or end of the PC is equalled only by the number of times it has been proven that it isn't.
    Only this is not really calling for the doom of the PC, just the end of one processor arms race (only to replace it with another. =P ). But it is true that the rise of mobile computing has pretty much killed laptops; keeping processors cheap on battery power will make mobile units much better. The PC will always have its place, but even I have put away the laptop for a tablet as my companion tech.

    This has been a good conversation

  • Quizzical Member LegendaryPosts: 25,531
    Moore's Law only says more transistors on a die.  It doesn't say what you'll do with them.
  • Quizzical Member LegendaryPosts: 25,531
    DMKano said:
    Kiyoris said:
    There is conflicting evidence about quantum computing.

    For general computing, it doesn't seem to be faster; it actually seems significantly slower.

    It is only for specific, very specific algorithms, on very, very specific problems, that you see a quantum speedup.



    That's with quantum computers TODAY.

    The thing with technology is that it gets better, and we eventually learn how to overcome those limits.

    Look back at the infancy of transistor-based computing: frankly, it sucked, but they knew the ceiling was really high.

    Same with quantum tech today: it sucks, but the ceiling is so high we can't even imagine it yet.

    It is contested whether the claimed quantum computers are real quantum computers.  Even if they are, it doesn't immediately follow that there will eventually be quantum computers that are really good, or even good at anything.  And even if really good quantum computers do arrive, "quantum" doesn't mean "really fast".  It's an entirely different type of tool from a classical computer, and it would supplement rather than replace it.

    It's also possible that quantum computers will only ever be for HPC use, and never widely used by consumers.  You can keep the quantum computing part within a tiny fraction of a degree of absolute zero if you've got a huge apparatus in a pristine environment, but that's rather harder to fit inside a cell phone.  Even if they got it to the point where it "only" took liquid nitrogen cooling (which is still hot enough that random thermal effects will wreak havoc on the quantum effects you're looking for), that's a non-starter for consumer use.
  • Malabooga Member UncommonPosts: 2,977
    DMKano said:
    Kiyoris said:
    Leon1e said:
    We don't upgrade because 1) We don't really feel the need to (0 hype about VR or 4K gaming)
    I don't see the benefit of 4K; you are effectively decreasing your FPS by about 2/3.

    Quadrupling the resolution leaves you with roughly 1/3 of the FPS.

    When we went from 800*600 to Full HD, we had the increases in hardware speed to support the move, and the increase in resolution was easily apparent. That's not the case with 4K.

    From my "4k is pointless" thread: increasing the resolution by 4x drops your framerate by about 70%.

    I am happily gaming at 2K - so are many others ;)
    The person you quote is apparently another one who doesn't understand that you don't use the same graphics card(s) to game at 1080p as you do to game at 4K...

    It's a bit like the people who can't afford a sports car and therefore say sports cars are crap ;)
    The speed limit is still 40-50 km/h, so buying that sports car for speed is pointless (unless you want to visit jail for a couple of months).

    Also, many sports cars can't even be driven outside the track (they're not road-legal; you can't register them and aren't allowed to drive them anywhere but a track). So buying one for commuting to work makes no sense, and yes, they are crap... for that.
  • Incomparable Member UncommonPosts: 1,138
    I thought all they had to do was add more transistors and that was the next gen, which of course gets more and more complicated as the features become more and more microscopic.

    Maybe it's part of Intel's strategy to first make their chips power efficient, so that when they add more transistors they don't overheat too quickly from packing so many into a small space?

    Sounds like PC gaming won't improve much in the foreseeable future. I guess SLI CPUs are the future as well?

    “Write bad things that are done to you in sand, but write the good things that happen to you on a piece of marble”

  • Malabooga Member UncommonPosts: 2,977
    edited February 2016
    Nope, "luxus car" is still subject to 50 Km/h speed limit.

    And theres no equuivalent of "luxus car" in CPUs. the only difference between CPUs is speed.

    The "speed limit" is your GPU.

    You fail in the concept of want and need. If the speed limit is 50 Km/h, all youll ever need is car that can do 50 Km/h. And most people consider their PC as a tool and all they want is their PC doing 50 Km/h. They dont really care that there are cars capable of doing 300 Km/h as they never go to race track and need it just to go from point A to point B at speed limit.

    Even if they go to highway 1/year they dont really see it worth it several times more in price (and couple of other benefits)

    To put it differently, its better to buy better GPU (as in raising "speed limit") as even 65$ CPUs are capable of doing much more than 50 Km/h
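
    Read literally as a bottleneck model (a rough sketch with invented numbers, not benchmark data), the analogy works out like this:

        # The frame rate you actually see is capped by the slower of the two
        # components. All figures are made up for illustration.
        def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
            return min(cpu_fps_cap, gpu_fps_cap)

        print(effective_fps(cpu_fps_cap=120, gpu_fps_cap=60))   # 60
        print(effective_fps(cpu_fps_cap=300, gpu_fps_cap=60))   # still 60: the "300 km/h" CPU gains nothing
        print(effective_fps(cpu_fps_cap=120, gpu_fps_cap=100))  # 100: the better GPU raises the "speed limit"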
  • Phry Member LegendaryPosts: 11,004
    edited February 2016
    Kiyoris said:
    Leon1e said:
    We don't upgrade because 1) We don't really feel the need to (0 hype about VR or 4K gaming)
    I don't see the benefit of 4K; you are effectively decreasing your FPS by about 2/3.

    Quadrupling the resolution leaves you with roughly 1/3 of the FPS.

    When we went from 800*600 to Full HD, we had the increases in hardware speed to support the move, and the increase in resolution was easily apparent. That's not the case with 4K.

    From my "4k is pointless" thread: increasing the resolution by 4x drops your framerate by about 70%.


    /facepalm

    I run my PC at 2560x1440. It can run at 4K at 60fps if the game supports it; so far the only one that does is Eve Online. I play FFXIV:ARR at 2560x1440 at 60fps because I locked it at that framerate, otherwise it would be around 70 to 80 fps. Unfortunately my monitor is only capable of 60Hz at 4K, so I don't tend to run games at over 60fps, even though the machine is more than capable of doing so.

    Unigine Heaven Benchmark 4.0
    FPS: 160.9
    Score: 4052
    Min FPS: 28.3
    Max FPS: 283.1
    System
    Platform: Windows 7 (build 7601, Service Pack 1) 64bit
    CPU model: AMD FX(tm)-9590 Eight-Core Processor (4018MHz) x4
    GPU model: NVIDIA GeForce GTX 980 Ti 10.18.13.5891 (4095MB) x1
    Settings
    Render: Direct3D11
    Mode: 1920x1080 fullscreen
    Preset: Custom
    Quality: High
    Tessellation: Disabled
    Powered by UNIGINE Engine
    Unigine Corp. © 2005-2013

    Unigine Heaven Benchmark 4.0
    FPS: 107.7
    Score: 2713
    Min FPS: 27.6
    Max FPS: 221.5
    System
    Platform: Windows 7 (build 7601, Service Pack 1) 64bit
    CPU model: AMD FX(tm)-9590 Eight-Core Processor (4018MHz) x4
    GPU model: NVIDIA GeForce GTX 980 Ti 10.18.13.5891 (4095MB) x1
    Settings
    Render: Direct3D11
    Mode: 2560x1440 fullscreen
    Preset: Custom
    Quality: High
    Tessellation: Disabled
    Powered by UNIGINE Engine
    Unigine Corp. © 2005-2013

    I even ran a 4K benchmark just to see what came of it. I must admit, for a benchmark tool that's three years old, I was a bit surprised.

    Unigine Heaven Benchmark 4.0
    FPS: 42.3
    Score: 1066
    Min FPS: 19.1
    Max FPS: 104.3
    System
    Platform: Windows 7 (build 7601, Service Pack 1) 64bit
    CPU model: AMD FX(tm)-9590 Eight-Core Processor (4018MHz) x4
    GPU model: NVIDIA GeForce GTX 980 Ti 10.18.13.5891 (4095MB) x1
    Settings
    Render: Direct3D11
    Mode: 3840x2160 fullscreen
    Preset: Custom
    Quality: High
    Tessellation: Disabled
    Powered by UNIGINE Engine
    Unigine Corp. © 2005-2013

    I think it's fairly safe to say that I'd probably need to run dual 980s if I wanted to game at 4K, but at 1440 it's perfectly fine, and there's very little real difference in performance from 1080.
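
    For what it's worth, the scaling across those three runs is easy to check (a small sketch using only the average FPS figures quoted above):

        # Resolution scaling across the three Heaven runs quoted above.
        runs = {
            (1920, 1080): 160.9,
            (2560, 1440): 107.7,
            (3840, 2160): 42.3,
        }
        base_pixels = 1920 * 1080
        base_fps = runs[(1920, 1080)]
        for (w, h), fps in runs.items():
            pixel_ratio = (w * h) / base_pixels
            print(f"{w}x{h}: {pixel_ratio:.2f}x pixels, {fps / base_fps:.0%} of 1080p FPS")
        # 3840x2160 is exactly 4x the pixels of 1080p and lands at ~26% of the
        # 1080p framerate here -- close to the "quadruple the resolution, lose
        # about 70% of your FPS" rule of thumb quoted earlier in the thread.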
  • gervaise1 Member EpicPosts: 6,919
    edited February 2016
    I can remember a "serious" magazine article, years ago, that concluded a new quad-speed CD drive was useless because a) we didn't need it and b) it was more expensive.

    However, as far as "processing" power goes, there is more than enough today to support 4K gaming.  NVidia's Shield TV will stream 4K @ 60fps, for example, and allows a (limited) number of games to be played from their cloud platform. (It will actually handle c.8K.) XB1s and PS4s have more power, but they are bigger and cost more. Not touting NVidia, btw, just pointing out that a tiny, inexpensive device (c.$200) will allow 4K gaming. And what you can fit in a tablet or a small box you can certainly fit in a PC! Or maybe a NUC solution.

    The future? Even IF no further hardware development took place, we could look at "supercomputer"-style dual-board graphics solutions to see what could happen. As Incomparable says above, SLI CPUs could be used, and the end result would almost certainly be smaller and more power efficient than at the turn of the century. Two big IFs there; unlikely, but possible.
  • Malabooga Member UncommonPosts: 2,977
    gervaise1 said:
    I can remember a "serious" magazine article, years ago, that concluded a new quad-speed CD drive was useless because a) we didn't need it and b) it was more expensive.

    However, as far as "processing" power goes, there is more than enough today to support 4K gaming.  NVidia's Shield TV will stream 4K @ 60fps, for example, and allows a (limited) number of games to be played from their cloud platform. (It will actually handle c.8K.) XB1s and PS4s have more power, but they are bigger and cost more. Not touting NVidia, btw, just pointing out that a tiny, inexpensive device (c.$200) will allow 4K gaming. And what you can fit in a tablet or a small box you can certainly fit in a PC! Or maybe a NUC solution.

    The future? Even IF no further hardware development took place, we could look at "supercomputer"-style dual-board graphics solutions to see what could happen. As Incomparable says above, SLI CPUs could be used, and the end result would almost certainly be smaller and more power efficient than at the turn of the century. Two big IFs there; unlikely, but possible.
    And some time later, CD/DVD manufacturers settled on a max speed, and it stopped at 16x, didn't it?

    Yes, sometimes more speed IS pointless.

    Currently, more speed is pointless for 99% of things, because current software (though that has finally started to change) doesn't even use what's already available. And then there's the question of what exactly that software would have to do to actually require that speed.

    Bottom line: Intel has thrown billions upon billions at this and they are still stuck, just stretching out their "roadmap" before any significant change. In anything.