
Prediction: future desktop CPUs will be less overclockable

Quizzical Member Legendary Posts: 25,531

Let's start with some background.  The original Pentium processor ran notoriously hot--with a TDP of 5.5 W.  For comparison, many modern chips have a TDP in the ballpark of 100 W, and 5.5 W would be an ultra low power version intended for tablets or fanless ultraportable laptops.

That was an era when CPU designers just tried to make the chip go as fast as possible and basically didn't care about heat.  After all, how many people would have wanted a chip that used half as much power at the expense of running 10% slower?  A difference of 3 W doesn't matter much, but 10% performance sure did; the concept of CPUs being fast enough at 60 MHz was rather ridiculous.

Since then, a die shrink every two years or so has allowed CPU designers to pack in twice as many transistors as before.  But a naive die shrink would mean each transistor used about 70% as much power as before, so the entire chip on net used 1.4 times as much power.  Exponentially increasing power would make chips run much hotter, and quickly.
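To make that arithmetic concrete, here's a minimal back-of-envelope sketch of the naive-shrink scenario.  The 2x transistor count and 0.7x per-transistor power come straight from the paragraph above; everything else is just compounding them, so treat the output as an illustration rather than data about any real process node.

```python
# Idealized die-shrink scaling: each generation doubles the transistor count
# but only cuts per-transistor power to ~70% of its previous value, so total
# chip power grows ~1.4x per generation if every transistor is kept busy.
transistors = 1.0      # relative transistor count
per_transistor = 1.0   # relative power per transistor

for generation in range(1, 6):
    transistors *= 2.0
    per_transistor *= 0.7
    total = transistors * per_transistor
    print(f"gen {generation}: {transistors:4.0f}x transistors, {total:.2f}x total power")

# gen 1:    2x transistors, 1.40x total power
# gen 5:   32x transistors, 5.38x total power -- the runaway heat problem
```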

As time passed, people figured out that they should put a heatsink on CPUs to keep them cool, and then later attach a fan to that heatsink.  Eventually heatpipes became necessary in desktops on all but low end coolers.  But better cooling apparatuses could only compensate for exponentially increasing heat output for so long.  Eventually there was bound to be trouble.

Trouble came with the arrival of the Pentium 4 in 2001.  By 2004, Intel was pushing laptop chips with a TDP of 88 W.  At considerable risk of stating the obvious, 88 W is a problem in a laptop.  Around this time, it became apparent that future chips would have to care about power consumption (equivalent to heat output by conservation of energy), not just performance.

High power consumption meant that CPUs would often be clocked well below the maximum speed they could have reached if power and heat weren't problems.  Core 2 Duo chips overclocked very well.  Bloomfield overclocked legendarily well, as many people overclocked a chip that wouldn't reach 3 GHz even with its turbo boost to run at 4 GHz or higher.

The downside of such enormous overclocks was runaway power consumption.  A Core i7-920 overclocked to 4 GHz might well pull 250 W under heavy loads, as compared to its TDP of 130 W.  Even in a desktop, 250 W is a lot.
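As a rough sanity check on that 250 W figure: dynamic CPU power scales roughly with frequency times voltage squared, so an overclock that also needs a voltage bump balloons quickly.  The voltages below are illustrative assumptions for the sake of the estimate, not measured values for any particular i7-920.

```python
# P_dynamic ~ C * V^2 * f, so relative power scales as (f2/f1) * (V2/V1)^2.
stock_power = 130.0                   # W, the i7-920's TDP
stock_freq, stock_volt = 2.66, 1.20   # GHz, V (assumed stock operating point)
oc_freq, oc_volt = 4.00, 1.35         # GHz, V (assumed overclock settings)

oc_power = stock_power * (oc_freq / stock_freq) * (oc_volt / stock_volt) ** 2
print(f"estimated power at {oc_freq} GHz: {oc_power:.0f} W")   # ~247 W
```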

As time has passed, subsequent die shrinks have continued to mean more transistors available, and that would continue to mean ever increasing power consumption for CPUs if chip designers used those extra transistors to add more cores.  Thus, the latest Xeon E5-2699 v3 with its 18 cores has to have a stock clock speed of only 2.3 GHz, though it can turbo some of them up as high as 3.6 GHz.  If Intel released a version with an unlocked multiplier and you built plenty of power delivery and cooling infrastructure around it, it's highly probable that you could overclock the chip to 4 GHz--though you'd surely blow far past the TDP of 145 W.

But that only happens if you use the extra space to add more cores.  Two cores are a lot better than one, and four are better than two.  But are eight cores better than four?  Sales of AMD's FX-8000 series chips answer that with a resounding "no" from the market--and for most consumer use, that's correct.  And would eighteen cores really be better than eight for any plausible consumer use?

So what happens to CPU power consumption with successive die shrinks if each core uses about as many transistors as before and each new chip sports about as many cores as before?  It goes down, that's what.  Thus, a Haswell quad core built to be a very nice 47 W laptop chip can also take the guise of an 88 W desktop chip clocked very near the chip's maximum potential.  Stock turbo goes up to 4.4 GHz, and Intel's promise of overclocks to 5 GHz on air proved illusory even on the cherry-picked samples in the hands of expert overclockers at review sites.

Now, 250 W in a desktop for a Core i7-920 at 4 GHz was a problem.  220 W for an FX-9590 at 5 GHz is also a problem.  But 88 W in a desktop?  That's not a problem; we're only a decade removed from Intel trying to put that very same 88 W TDP into laptops.  If clocking a chip near its maximum potential still keeps the chip inside of 100 W, then why not?  And remember that that 220 W AMD chip was an 8-core CPU; clock a 4-core part the same way and it probably has a much more desktop-friendly power consumption.

We got a preview of this future some years ago with the Intel Atom.  It was originally meant to be a cell phone chip, but overclocking it nearly as far as it could go still left the TDP below 20 W, so that's what Intel did in desktops.  One site used liquid nitrogen to try to overclock the chip further and only got about 2.4 GHz out of it, from a stock speed of 1.8 GHz or so.  And that's with liquid nitrogen; the maximum overclock on air cooling would have been much milder.

Today, AMD's Kabini chip is in roughly the same boat, as are newer generations of Intel's Atom chips.  AMD's Kaveri chip seems targeted at a 19 W TDP even for a quad core version--and with the GPU eating up much of that.  In a desktop, the difference between letting it use 65 W versus 100 W barely makes a dent in performance.  While Intel's Haswell quad core was primarily targeted at 47 W laptop chips, I wouldn't be surprised in the slightest if the most common TDP on quad core Broadwell laptop chips is below 20 W.

If you take a chip that is meant to use 20 W and let it have 40 W, then yes, it can probably run quite a bit faster.  But how much more power beyond that actually yields more performance?  Higher clock speeds mean more heat and higher temperatures, and that reduces the clock speed at which the chip can run stably.  Thus, I expect that overclocking the top bin of most future mainstream desktop CPUs even by 10% will often take a very beefy cooling system, as happened with past generations of processors clocked near their maximum, such as Intel's Pentium 4 and AMD's Phenom I.

Now, the many-core chips can still use a lot of power because they have so many cores.  I expect that the Core i7-5960X with its 8 cores and max turbo of only 3.5 GHz will have a lot of overclocking headroom.  Rumors have AMD working on a chip with 16 Steamroller cores; if the chip exists and AMD releases a desktop version of it, that will probably have a lot of overclocking headroom, too.

But such overclocks will only be to the same speeds that chips with fewer cores can run at.  Haswell cores in a Core i7-4790K can run at 4.4 GHz, so effectively the same Haswell cores in a Core i7-5960X can probably do so, too.  But all that such overclocking will get you is a chip with more cores running at the same clock speed that a chip with fewer cores offers at stock settings.

And perhaps more to the point, these aren't mainstream consumer chips that we're talking about.  The Core i7-5960X is a $1000 chip.  The Xeon E5-2699 v3 surely costs several times that.  If AMD makes a chip with 16 Steamroller cores it will be far superior to their current Abu Dhabi chip with 16 Piledriver cores whose cheapest bin costs $700--so the new chip will probably not be cheaper than that, even if it does offer a desktop version.

Now, I don't think overclocking is going to go away entirely.  But I do expect that on most future desktop chips, Intel and AMD will offer a version stock clocked near its maximum potential, and will hamper overclocking on lower bins, as they long have, to make it harder to buy a cheaper, lower bin and overclock it to match the higher one.


Comments

  • grndzro Member Uncommon Posts: 1,163

    It's a matter of transistor density vs heat dissipation. The heat transistors generate does not decrease proportionally to the chip's ability to dissipate heat with node shrinks.

    So eventually you are left with performance gains coming strictly from lower transistor costs and higher density.

  • DamonVile Member Uncommon Posts: 4,818

    This is kind of off topic but still about the future of desktops, so...

    As internet connections get better and better, don't you think the future of PCs is going to move away from physical boxes altogether and toward putting everything into a cloud? A PC could one day be nothing more than a keyboard, mouse, and screen with an internet connection. Or do you think that's something we'll never actually see, and that desktops will always outperform anything like that?

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    Which do you think is easier:  to send some amount of data a few millimeters across a chip or to send the same amount of data a few hundred miles across the Internet?  If you change hundreds of miles to tens of miles, would you expect that to change the answer?  Do you think adding another decade or so of technological advances will change the answer?  I sure don't.

    If you can readily do some computations locally, why would you do them remotely, even if you could?  You lose performance by doing the computations in "the cloud" if you mean passing data over the public Internet, in addition to losing security and reliability.  Doing stuff off in the cloud makes sense when what you need would overwhelm a single computer, or in some enterprise situations where you have a bunch of computers on a LAN (not the Internet!) and can hire full time IT people to manage them.  Neither of those are descriptive of consumer uses.

    I wouldn't be surprised if some companies do try to push consumers to use cloud stuff even where it makes no sense at all.  That companies try to push you to buy something stupid doesn't mean that consumers will fall for it, however.  OnLive has already tried to push gaming into the cloud--and failed miserably.  Once technology is "good enough", people lose incentives to upgrade, and vendors in the business of selling you upgrades don't like that.  Thus, they have to try to convince you to stop using the old technology that works and instead adopt some new technology that doesn't entirely work right so that you'll need to upgrade it again.  See, for example, Ultrabooks, gaming laptops, wireless mice, or Kinect, all of which are efforts at replacing things that worked better than the new replacement.

  • TheLizardbones Member Common Posts: 10,910
    Originally posted by Quizzical
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    Which do you think is easier:  to send some amount of data a few millimeters across a chip or to send the same amount of data a few hundred miles across the Internet?  If you change hundreds of miles to tens of miles, would you expect that to change the answer?  Do you think adding another decade or so of technological advances will change the answer?  I sure don't.

    If you can readily do some computations locally, why would you do them remotely, even if you could?  You lose performance by doing the computations in "the cloud" if you mean passing data over the public Internet, in addition to losing security and reliability.  Doing stuff off in the cloud makes sense when what you need would overwhelm a single computer, or in some enterprise situations where you have a bunch of computers on a LAN (not the Internet!) and can hire full time IT people to manage them.  Neither of those are descriptive of consumer uses.

    I wouldn't be surprised if some companies do try to push consumers to use cloud stuff even where it makes no sense at all.  That companies try to push you to buy something stupid doesn't mean that consumers will fall for it, however.  OnLive has already tried to push gaming into the cloud--and failed miserably.  Once technology is "good enough", people lose incentives to upgrade, and vendors in the business of selling you upgrades don't like that.  Thus, they have to try to convince you to stop using the old technology that works and instead adopt some new technology that doesn't entirely work right so that you'll need to upgrade it again.  See, for example, Ultrabooks, gaming laptops, wireless mice, or Kinect, all of which are efforts at replacing things that worked better than the new replacement.

     

    What works in the cloud comes down to latency*.  Humans, on average, have a 250ms response time.  The time it takes to perceive a visual stimulus is somewhere over 50ms or so.  If a game is being run from the cloud and the latency is greater than 250ms, or the responses to the player's actions well exceed 50ms, players are not going to be super pleased, because players can compare cloud gaming to gaming locally, which has very low latency.  It doesn't seem like this is going to change dramatically any time soon.

     

    * In my opinion.
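    To put those reaction-time numbers next to a streaming pipeline, here is a hedged, illustrative latency budget.  Every figure is an assumption made up for the comparison, not a measurement of any real service.

```python
# Illustrative per-frame latency budget (milliseconds), local vs. cloud.
local_ms = {"input": 5, "render": 16, "display": 10}
cloud_ms = {"input": 5, "uplink": 30, "server render": 16,
            "encode": 10, "downlink": 30, "decode": 10, "display": 10}

print("local total:", sum(local_ms.values()), "ms")   # ~31 ms
print("cloud total:", sum(cloud_ms.values()), "ms")   # ~111 ms
# The cloud path blows well past the ~50 ms perception threshold mentioned
# above before the player's ~250 ms reaction time even enters the picture.
```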

     

    I can not remember winning or losing a single debate on the internet.

  • Vrika Member Legendary Posts: 7,999
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    I think that the cloud will be able to deliver the performance, but won't be used.

    Mobile devices will not be able to operate with everything requiring a net connection; the network just isn't good enough to work everywhere as we move around. I think they will keep requiring programs that run locally, and will be so much more popular than non-mobile devices that they'll keep computers out of the cloud.

     

     

    On future overclocking: I agree with Quizzical, but for slightly different reasons. I think in future people don't want to overclock even if they could:

    Some years ago, computer speed was much more limited and overclocking meant visibly smoother framerates or being able to turn on that extra graphical effect that really made the game look much better. Today it's getting to the point where I'm not able to spot the difference between 16x and 8x anti-aliasing and can't remember the last time I had some game lagging. Computer speed isn't such a big thing for anyone but hard-core hardware enthusiasts.

    Some years ago, when I bought new computer parts, I expected them to last 2 years before needing to be replaced with faster parts if I was lucky. Today it's more like 6 years. Since a part is supposed to last me many times longer, pushing it to the very limits of its capability by overclocking seems less sensible than just letting it run safe at stock speeds.

    I think overclocking is something fewer and fewer people want to do, and if only a small number of people want to do it, then manufacturers will be less and less interested in supporting it.

     
  • Loke666 Member Epic Posts: 21,441

    Interesting post, Quizz. And yes, it seems to be heading this way in the near future; after that, who can tell?

    GPUs have already been on and off there. But it is not impossible that overclockers will focus on memory instead in the future and clock up the FSB more than before. It is not as common as overclocking the CPU and GPU right now, since most computers don't have much cooling on the memory chips.

    Myself, I run a slightly overclocked, watercooled AMD Phenom II hexacore right now, but to be honest, overclocking today is not near what we could do with the first generation of Celeron processors, and I am not sure I will even bother to OC the next rig I get when DDR4 RAM hits us.

    GPUs I haven't overclocked since Nvidia's 8800 cards. But frankly, I don't currently feel any need to OC my 780 card, and I did once burn out an Nvidia 7950 (or at least it got weird and started drawing odd shapes in games), so I prefer being careful there unless I plan to upgrade within a few months anyway.

  • Drunk-fu Member Uncommon Posts: 133

    I've had a few PCs, but I haven't touched the BIOS with the intent of overclocking since my Pentium II.

    I've had my current machine since 2009 or so; it has 8 GB of DDR3 and a Core i7-920 on stock clocks.

    And it runs everything perfectly, though I have to mention I play only in HD.

    So I do believe many of us don't feel the need to overclock.

    Who is going to miss the ability to do so, besides PC freaks and gamers who think +2 FPS to their "900" will make them better?

  • Quizzical Member Legendary Posts: 25,531

    My argument is not that fewer people will want to overclock.  Rather, it is that among those who choose to overclock, they typically won't be able to overclock the CPU as far, at least on a percentage basis.

    This applies only to CPUs, not GPUs.  For GPUs, I actually expect the opposite.  While CPU loads don't typically scale well to arbitrarily many cores, GPU loads sure do.  Thus, I expect the trend of ever increasing power consumption for a given die size, if you push the chip as hard as it will go, to continue in GPUs--and that will convince GPU vendors to throttle back clock speeds more than they have in the past for the sake of reducing heat output.  That means more overclocking headroom available on GPUs, though you're going to need an awfully good cooler to handle a chip that puts out 400 W under heavy loads.

  • Ridelynn Member Epic Posts: 7,383

    I think your prediction has been reality for the past few years.

    Turbo/Boost technologies do the overclocking for you. They do it within warranty. They do it automatically according to real-time parameters that matter to overclocking (heat, power, total load, load distribution, etc.). As Turbo/Boost technologies continue to refine and evolve, the manufacturers will be the ones squeezing every bit of performance out of a chip.

    Which, actually, I welcome. I would much rather pay for a guaranteed and warrantied clock speed than try to get lucky with a good bin and pray I can get more speed.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by DamonVile
    This is kind of off topic but still about the future of desktops, so...As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    I think we will see a lot of things move out to "the cloud" (I'm betting that name changes soon, like we don't say "Information Superhighway" anymore, thank goodness).

    But there are a lot of things that aren't better in the cloud - whether for performance, security, liability, privacy, or any of dozens of other reasons. Computing power at your home, office, desktop, laptop, tablet, pocket, watch, glasses, etc. - that will never totally go away.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Torvaldr

    I can't wait for the "cloud" buzzword to die. It elicits a lot of unreasonable fear in those who don't understand it. For those that don't know, there has been a long history of cycling between thin and fat clients through technology eras and changes. Internet and cloud based computing is sort of an evolution of the thin client concept.

    There are some tasks that work well with web based applications and some that don't, and many that work well in some sort of hybrid state. Cloud 9 IDE is a tool that works great as a cloud-based service. You can collaboratively develop in real time, and if you're doing website work you can see the updates in a browser session as you program. Web mail services like Gmail are a classic example of a program that works better as a web service than as a local application. Some tasks that have intensive data transfer, such as database ETL work, don't work well over the internet, at least not until pipes become fat and long distances become irrelevant. Some tasks like Microsoft Office 365 work well as a hybrid - you have a web based install and strong integration with your remotely served files, but can have local copies, and can use web-based interfaces to edit.

    The security debate with the "cloud" always puzzles me. Unless your data is in a walled garden with no remote access then it has a potential for compromise. The key is to have relationships with companies that employ good security models and practices. For some reason people think their local networks are more securely hardened against attacks than Amazon, Google, or Microsoft.

    You're thinking largely about enterprise use, not consumer use.  Most home users who edit a document don't need it to be immediately available to 10 other people who may also make edits.

    As for security, the more places you have copies of your stuff, the more places it can be stolen from.  Think of the recent Home Depot breach with many millions of credit cards stolen.  If you've got data on one credit card on your home computer and some big box retailer has 40 million on their network, you don't have to be just as secure as the big box retailer to be safer on your home computer.  If it takes 1000 times as much effort to steal 40 million credit cards from Home Depot as to steal 1 card from a random home user, cyber criminals will prefer to go after the former.

    Some services do make sense to have off in "the cloud".  But it's never going to be the case that it makes sense to have everything done remotely.  So long as some things that you do are better done locally, you need to have the hardware to do those things locally.

  • sacredfool Member Uncommon Posts: 849
    Originally posted by Quizzical

    You're thinking largely about enterprise use, not consumer use.  Most home users who edit a document don't need it to be immediately available to 10 other people who may also make edits.

    At the same time, Quizz, I think many people, me included, would enjoy having their PC always with them. 

    It would be good to have the same level and ease of access to files/games/applications no matter if I am at home, at uni, at the workplace, or at my grandfather's house. 

    I think cloud computing will work when it allows us such an option. Security will be a serious issue, but I generally find people opt for comfortable, less secure solutions. 


    Originally posted by nethaniah

    Seriously Farmville? Yeah I think it's great. In a World where half our population is dying of hunger the more fortunate half is spending their time harvesting food that doesn't exist.


  • Ridelynn Member Epic Posts: 7,383


    Originally posted by sacredfool
    At the same time Quizz, I think many people, me included, would enjoy having their PC always with them. It would be good to have the same level and ease of access to files/games/applications no matter if I am at home, at uni, at the workplace and at my grandfathers house. I think cloud computing will work when it allows us such an option. Security will be a serious issue, but I generally find people opt for comfortable, less secure solutions. 

    This is a big benefit to cloud computing applications - and we have seen, at least in the US, that convenience will trump privacy/security almost every time.

    However, this is mostly a derail - I don't think cloud computing will ever totally supplant local CPU power. Users will always want as much local power as they can afford and is convenient. And that is a good thing for cloud computing as well - the more stuff you can run locally, the less stuff has to go over the pipes and to the cloud.

    And more to the point of the OP, I think future CPUs are largely less overclockable because the factory is doing it for you - via more aggressive clocking (which is possible due to better engineering, quality control, and power management) and Turbo/Boost type technologies.

  • syntax42 Member Uncommon Posts: 1,385

    I think the prediction only works if you assume we will continue to use silicon-based technology.  We might be five to ten years away from something replacing silicon in processors.  Numerous technologies, including graphene and single-atom switches, have been in research labs for five or more years.  If the tech could advance computing technology, you can bet Intel, AMD, and/or other processor companies are working on their version of it.

    If and when we move to a new technology, clock speeds may become less relevant, or overclocking may depend on something entirely different from heat dissipation.  If the new technology consumes so little power that heat isn't the issue, something like interference from outside electrical noise could determine your stable clock frequency.

  • g0m0rrah Member Uncommon Posts: 325
    Originally posted by Quizzical
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    Which do you think is easier:  to send some amount of data a few millimeters across a chip or to send the same amount of data a few hundred miles across the Internet?  If you change hundreds of miles to tens of miles, would you expect that to change the answer?  Do you think adding another decade or so of technological advances will change the answer?  I sure don't.

    If you can readily do some computations locally, why would you do them remotely, even if you could?  You lose performance by doing the computations in "the cloud" if you mean passing data over the public Internet, in addition to losing security and reliability.  Doing stuff off in the cloud makes sense when what you need would overwhelm a single computer, or in some enterprise situations where you have a bunch of computers on a LAN (not the Internet!) and can hire full time IT people to manage them.  Neither of those are descriptive of consumer uses.

    I wouldn't be surprised if some companies do try to push consumers to use cloud stuff even where it makes no sense at all.  That companies try to push you to buy something stupid doesn't mean that consumers will fall for it, however.  OnLive has already tried to push gaming into the cloud--and failed miserably.  Once technology is "good enough", people lose incentives to upgrade, and vendors in the business of selling you upgrades don't like that.  Thus, they have to try to convince you to stop using the old technology that works and instead adopt some new technology that doesn't entirely work right so that you'll need to upgrade it again.  See, for example, Ultrabooks, gaming laptops, wireless mice, or Kinect, all of which are efforts at replacing things that worked better than the new replacement.

    If the internet were no longer the bottleneck, I could see an advantage to highly specialized servers connected to terminals. This would pretty much kill the low-budget PC market. Why spend $400 to $500 on a low-budget gaming system when you can simply buy a decent terminal for $100 and let the servers do the heavy lifting? Like I said, this relies on the internet not being the bottleneck, and I can see that in the US 720p is already saturating the internet, so I can't imagine what the leap to 4K would do.

    On the hardware/software side, it's amazing to me that people keep pushing the hardware envelope but software always gets a pass. Why the fuck are we working on 18-core chips when the software side is the real bottleneck? If 90% of the software available uses 4 cores or less, it seems obvious to me that the hardware guys need to get with the software guys and simplify the means by which software developers can use multiple cores. I would guess that for gaming, a simple jump from 4 to 8 cores should eliminate the CPU as the bottleneck, which is all that really matters for budget gaming.
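    A quick Amdahl's-law sketch of why piling on cores buys so little when software parallelizes poorly.  The 70% parallel fraction is an illustrative guess, not a measurement of any real game.

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that can run in parallel and n is the number of cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.70   # assumed parallel fraction
for cores in (1, 2, 4, 8, 18):
    print(f"{cores:2d} cores: {speedup(p, cores):.2f}x")

#  4 cores: 2.11x
#  8 cores: 2.58x   <- most of the available gain is already captured
# 18 cores: 2.95x   <- 18 cores barely beats 8 for this kind of workload
```

    With numbers like these, the jump from 4 to 8 cores matters far more than the jump from 8 to 18, which is the point about consumer core counts above.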

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by syntax42

    I think the prediction only works if you assume we will continue to use silicon-based technology.  We might be five to ten years away from something replacing silicon in processors.  Numerous technologies, including graphene and single-atom switches have been in research labs for five or more years.  If the tech could advance computing technology, you can bet Intel, AMD, and/or other processor companies are working on their version of it.

    If and when we move to a new technology, clock speeds may become less relevant, or overclocking may depend on something entirely different from heat dissipation.  If the new technology consumes so little power that heat isn't the issue, something like interference from outside electrical noise could determine your stable clock frequency.

    While today's CPUs are predominantly silicon, there are a lot of other materials in there.  Exactly what materials and their proportions varies from one process node to the next.  Figuring out which materials to use and how to put them where you want them is a considerable fraction of the work in creating new process nodes.

    Graphene isn't some magical material that will make everything miraculously possible.  It will have trade-offs of its own, just like every other material that has ever been used, and will inevitably be inferior to silicon in some ways.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by g0m0rrah
    Originally posted by Quizzical
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    Which do you think is easier:  to send some amount of data a few millimeters across a chip or to send the same amount of data a few hundred miles across the Internet?  If you change hundreds of miles to tens of miles, would you expect that to change the answer?  Do you think adding another decade or so of technological advances will change the answer?  I sure don't.

    If you can readily do some computations locally, why would you do them remotely, even if you could?  You lose performance by doing the computations in "the cloud" if you mean passing data over the public Internet, in addition to losing security and reliability.  Doing stuff off in the cloud makes sense when what you need would overwhelm a single computer, or in some enterprise situations where you have a bunch of computers on a LAN (not the Internet!) and can hire full time IT people to manage them.  Neither of those are descriptive of consumer uses.

    I wouldn't be surprised if some companies do try to push consumers to use cloud stuff even where it makes no sense at all.  That companies try to push you to buy something stupid doesn't mean that consumers will fall for it, however.  OnLive has already tried to push gaming into the cloud--and failed miserably.  Once technology is "good enough", people lose incentives to upgrade, and vendors in the business of selling you upgrades don't like that.  Thus, they have to try to convince you to stop using the old technology that works and instead adopt some new technology that doesn't entirely work right so that you'll need to upgrade it again.  See, for example, Ultrabooks, gaming laptops, wireless mice, or Kinect, all of which are efforts at replacing things that worked better than the new replacement.

      If the internet was no longer the bottleneck I could see an advantage to highly specialized servers connected to terminals. This would pretty much kill the low budget PC market.  Why spend 400 to 500 $ on a low budget gaming system when you can simply buy a decent terminal for 100$ and let the servers do the heavy lifting.  Like I said, this relies on  the internet not being the bottleneck and I can see that in the US 720P is already saturating the internet so I cant imagine what the leap to 4k would do.

      On the hardware/software side, its amazing to me that people keep pushing the hardware envelope but software always gets a pass.  Why the fuck are we working on 18 core chips when the software side is the real bottleneck.  If 90% of the software available uses 4 cores or less, it seems obvious to me that the hardware guys need to get with the software guys and simplify the means by which software developers can use multiple cores.  I would guess that for gaming a simple jump from 4 to 8 cores should eliminate the CPU as the bottleneck, which is all that really matters to budget gaming.

    It's not like the Internet will keep improving while advances in local rendering will come to a screeching halt.  I'd argue that the latter will tend to increase faster than the former, as Internet latency is limited by the speed of light--and the speed of light traveling through a material such as a fiber optic cable is substantially slower than the speed of light in a vacuum.
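    Some rough numbers behind that speed-of-light point.  Light in fiber travels at roughly two-thirds of c, and real routes add routing, queuing, encoding, and server time on top, so these figures are a floor rather than an estimate of actual ping times.

```python
# Minimum round-trip propagation delay through fiber, ignoring everything
# except the speed of light in glass (~2/3 of c).
C_KM_PER_MS = 299.792   # speed of light in vacuum, km per millisecond
FIBER_FACTOR = 0.67     # light in fiber is roughly two-thirds as fast

def round_trip_ms(distance_km):
    return 2 * distance_km / (C_KM_PER_MS * FIBER_FACTOR)

for km in (50, 500, 3000):
    print(f"{km:5d} km: at least {round_trip_ms(km):4.1f} ms round trip")

#    50 km:  ~0.5 ms
#   500 km:  ~5.0 ms
#  3000 km: ~29.9 ms -- versus effectively nothing for a few millimeters on-chip
```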

    You're always going to need input devices or else you can't make a "thin client" do anything.  You need output devices (e.g., a monitor) or else you can't see what it's doing.  You need an OS and drivers to tell everything what to do.  So you're not going to be able to just plug a fiber optic cable into a monitor and have a working computer that way unless the "computer" is inside the monitor all-in-one style.

    Plenty of companies will sell you a "system on a chip" (SoC) today, which lets you build a computer with just one chip that has the CPU, GPU, memory controller, external ports, and whatever else the system needs.  That's not just cell phones anymore, but has scaled up as AMD's Kabini and at least one variant of an Intel Haswell dual core are SoCs.  If you need to have a chip to make the computer work and that one chip can do quite a bit in not much space, why not do some work locally?

    Look what you can get for $250 today, for example:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16834317808

    Very low end, to be certain, but not a complete piece of junk.  Indeed, performance would have been competitive a decade ago.  And I'm betting that what you can get for $250 will keep improving--and that the baseline of the minimum to buy a functional computer will keep dropping.  Indeed, it wasn't that long ago that you couldn't get a new PC for $1000, even a low end piece of junk.  For comparison, look at the price tag on today's version of a low end piece of junk:

    http://www.newegg.com/Product/Product.aspx?Item=9SIA1M80MP8921

  • iridescence Member Uncommon Posts: 1,552
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    As someone who is somewhat concerned about my privacy and who wishes to retain unambiguous ownership of what is on my hard drive, this is a nightmare scenario for me. Some people just don't think of all the implications of moving everything into the cloud.

    I will certainly not use a cloud based computer as long as there is a physical alternative available.

     

  • g0m0rrah Member Uncommon Posts: 325
    Originally posted by Quizzical
    Originally posted by g0m0rrah
    Originally posted by Quizzical
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    Which do you think is easier:  to send some amount of data a few millimeters across a chip or to send the same amount of data a few hundred miles across the Internet?  If you change hundreds of miles to tens of miles, would you expect that to change the answer?  Do you think adding another decade or so of technological advances will change the answer?  I sure don't.

    If you can readily do some computations locally, why would you do them remotely, even if you could?  You lose performance by doing the computations in "the cloud" if you mean passing data over the public Internet, in addition to losing security and reliability.  Doing stuff off in the cloud makes sense when what you need would overwhelm a single computer, or in some enterprise situations where you have a bunch of computers on a LAN (not the Internet!) and can hire full time IT people to manage them.  Neither of those are descriptive of consumer uses.

    I wouldn't be surprised if some companies do try to push consumers to use cloud stuff even where it makes no sense at all.  That companies try to push you to buy something stupid doesn't mean that consumers will fall for it, however.  OnLive has already tried to push gaming into the cloud--and failed miserably.  Once technology is "good enough", people lose incentives to upgrade, and vendors in the business of selling you upgrades don't like that.  Thus, they have to try to convince you to stop using the old technology that works and instead adopt some new technology that doesn't entirely work right so that you'll need to upgrade it again.  See, for example, Ultrabooks, gaming laptops, wireless mice, or Kinect, all of which are efforts at replacing things that worked better than the new replacement.

      If the internet was no longer the bottleneck I could see an advantage to highly specialized servers connected to terminals. This would pretty much kill the low budget PC market.  Why spend 400 to 500 $ on a low budget gaming system when you can simply buy a decent terminal for 100$ and let the servers do the heavy lifting.  Like I said, this relies on  the internet not being the bottleneck and I can see that in the US 720P is already saturating the internet so I cant imagine what the leap to 4k would do.

      On the hardware/software side, its amazing to me that people keep pushing the hardware envelope but software always gets a pass.  Why the fuck are we working on 18 core chips when the software side is the real bottleneck.  If 90% of the software available uses 4 cores or less, it seems obvious to me that the hardware guys need to get with the software guys and simplify the means by which software developers can use multiple cores.  I would guess that for gaming a simple jump from 4 to 8 cores should eliminate the CPU as the bottleneck, which is all that really matters to budget gaming.

    It's not like the Internet will keep improving while advances in local rendering will come to a screeching halt.  I'd argue that the latter will tend to increase faster than the former, as Internet latency is limited by the speed of light--and the speed of light traveling through a material such as a fiber optic cable is substantially slower than the speed of light in a vacuum.

    You're always going to need input devices or else you can't make a "thin client" do anything.  You need output devices (e.g., a monitor) or else you can't see what it's doing.  You need an OS and drivers to tell everything what to do.  So you're not going to be able to just plug a fiber optic cable into a monitor and have a working computer that way unless the "computer" is inside the monitor all-in-one style.

    Plenty of companies will sell you a "system on a chip" (SoC) today, which lets you build a computer with just one chip that has the CPU, GPU, memory controller, external ports, and whatever else the system needs.  That's not just cell phones anymore, but has scaled up as AMD's Kabini and at least one variant of an Intel Haswell dual core are SoCs.  If you need to have a chip to make the computer work and that one chip can do quite a bit in not much space, why not do some work locally?

    Look what you can get for $250 today, for example:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16834317808

    Very low end, to be certain, but not a complete piece of junk.  Indeed, performance would have been competitive a decade ago.  And I'm betting that what you can get for $250 will keep improving--and that the baseline of the minimum to buy a functional computer will keep dropping.  Indeed, it wasn't that long ago that you couldn't get a new PC for $1000, even a low end piece of junk.  For comparison, look at the price tag on today's version of a low end piece of junk:

    http://www.newegg.com/Product/Product.aspx?Item=9SIA1M80MP8921

     

    I believe that for home use, CPUs are going to go one way...

    1.  The CPU is simply good enough that it's not the bottleneck.

    This, I believe, is what AMD is trying to reach: a CPU that is just fast enough that it isn't what's slowing you down.  With gaming specifically, balancing a CPU with the correct GPU seems core to budget builds.  You can always tell when someone is new to building a PC, because they buy a monster CPU or GPU but the other one is so inferior that they aren't getting any real performance out of that expensive product.  With my wife's PC, I tried to put together a CPU that's good enough to go through one GPU upgrade in the future before I possibly need to replace it.  New GPUs and CPUs are coming out slowly enough that either should last a couple of years.

    It seems that software developers either aren't good enough to create code that runs efficiently on the hardware we have today, or they simply code for the lowest common denominator to get the most money possible.  I am not saying coding for the high end should be the priority, but in all honesty, if software released today can't use 4+ cores when available, those are the people holding computers back.  Skyrim is a good example of shit software.  They ported Skyrim to the PC and I believe it was limited to addressing 1 gig of RAM; I mean, seriously, 1 gig.  Of course modders took over and fixed a problem that shouldn't have existed in the first place.

    I am sure that, due to physics, the CPU in its current form is limited in speed by heat or simply by size restrictions.  How small can you possibly make an insulator between two conductors without some sort of breakdown?  At some point, smaller is no longer going to be better.  It seems adding cores and actually using them efficiently will be the interim step to some technology leap, or at least some change in insulator.

    Every time I see a decent-looking game on an Xbox One or PS4, I begin to wonder how they managed to leverage the pretty mediocre hardware available.  Then I peer at a mediocre PC with a 1090T, a Radeon 6950, and 8 gigs of 1600 MHz RAM, and I wonder why the fuck they can't leverage the hardware in that system.

  • zevian Member Uncommon Posts: 403
    Originally posted by g0m0rrah

       Everytime I see a decent looking game on an X bone or ps4 I begin to wonder how they managed to leverage the pretty mediocre hardware available.  Then I peer at a mediocre PC with a 1090t, a radeon 6950,  and 8 gig of 1600 and I wonder why the fuck cant they leverage the hardware in that system. 

     

     

    They manage to put games that look good on the PS4 or Xbox One due to the fact that the device is standardized and specialized to perform that task.  That's all it does.

    A PC does a number of different things all the time; the CPU isn't just running the game, it's managing every other thing the computer has loaded into memory.  This is why you see games on the Xbox One running at lower FPS than on, say, the PS4: the Xbox One is also running a smaller, modified operating system controlling the other built-in home media features, so less of the device's power is dedicated to performing its main task (gaming).

    The PC gets better results mainly through brute force: it's doing all these other things while at the same time playing your game, but it's doing it all with more resources available, so it can perform better.  If we saw a modern-day gaming system with cutting-edge specifications dedicated to only one purpose (playing games), you would see some amazing things out of that device.  Steam is trying to accomplish something like that with their Steam Box.  You would be amazed what standardized parts allow a programmer to do; that's why games on consoles get better the longer the console is out: programmers learn better and more efficient ways to make those specific parts work together.  The mishmash of parts in gaming PCs really does hold PC-only game development back; instead of programming for one CPU and one video card, developers have to program for countless chips and video cards (amongst all the other components).

    TL;DR: component standardization is HUUUUGE in programming.  It not only makes a developer's job easier but also lets them extract more power out of older components, because they learn the limits and capabilities of that specific set.

     

  • g0m0rrah Member Uncommon Posts: 325
    Originally posted by zevian
    Originally posted by g0m0rrah

       Everytime I see a decent looking game on an X bone or ps4 I begin to wonder how they managed to leverage the pretty mediocre hardware available.  Then I peer at a mediocre PC with a 1090t, a radeon 6950,  and 8 gig of 1600 and I wonder why the fuck cant they leverage the hardware in that system. 

     

     

    They manage to put games that look good on the PS4 or XBON due to the fact that device is standardized and specialized to perform that task.   Thats all it does.

     

    A  PC does a number of different things all the time, the CPU isnt just running the game, its managing every other thing the computer has loaded into memory,   this is why you see games on XBONE running at less FPS than say the PS4, the XBONE is also running a smaller modified operating system controlling the other home media features built in, so less of the devices power is dedicated to performing its main task (gaming).    

     

    The PC  gets better results mainly through brute force, its doing all these other things but at the same time its playing your game but its doing it all with more resources availiable so it can perform better.     If we saw a modern day gaming system with cutting edge specifications dedicated to only 1 purpose (playing games)   you would see some amazing things out of that device.                Steam is trying to accomplish something like that with their STEAMBOX  you would be amazed what stadardized parts allow a programmer to do,   Thats why you see on game consoles games get better as the console is out longer, programmers learn better and more efficient ways to make those specific items work better together.    The mish mash of parts in gaming PC's really do hold PC only game development back, instead of programming for x cpu and x video card they need to program for xxxxx Chips and xxxxx video cards (amongst all the other components).

     

    TLDR  component standardization is HUUUUGE is programming it not only makes a developers job easier but allows them to extract more power out of older components because they learn the limits and capabilities of that specific set.

     

    I will always be against component standardization.  I hate Apple, and the PC market to me is about competition.  I think you put too much emphasis on hardware standardization and simply ignore developer laziness.  Skyrim is a great example of a shit developer port, addressing only 1 gig of RAM, which is obvious developer laziness.  Modders fixed the RAM issue fast, which shouldn't have had to happen.

    I would guess that it isn't hardware standardization which promotes excellent games, or otherwise we would see amazing games on the Mac, which we do not.  It's probably Sony and Microsoft forcing developers to use quality code for console games, whereas there is no enforcement or rules when creating a game for an open system such as the PC.

  • Edli Member Posts: 941
    Originally posted by zevian

     

    The mish mash of parts in gaming PC's really do hold PC only game development back, instead of programming for x cpu and x video card they need to program for xxxxx Chips and xxxxx video cards (amongst all the other components).

    And I hope the PC never gets standardized like consoles. Yes, you can get better results if you target only one CPU/GPU/RAM combination, but you know what else that means? The game will work only on that specific CPU/GPU/RAM.

    The beauty of the PC is that games are designed not around specific hardware but around APIs, which means I can swap one GPU for another anytime I want and my games will still work. Look at the PS4 and Xbox One. All those games you bought for the PS3 and 360 through the years? They're useless now because they were made for those specific pieces of hardware. Meanwhile, I can buy a brand new PC with the latest OS and every game released in the past 20 years still works like a champ.

    Whoever asks for standardization of the PC asks for the death of PC gaming. If you want a console, buy a console; you have three to choose from. On the other hand, we have only one platform which is open, so let's keep it like that.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by g0m0rrah

     

     I believe for home use CPU's are going to go one way...

    1.  The CPU is simply good enough that its not the bottleneck.

           This I believe is what amd is trying to reach, a CPU that is just fast enough that it isnt whats slowing you down.  With gaming specifically, balancing a CPU with the correct GPU seems core to budget builds.  You can always tell when someone is new to building a PC because they buy a monster CPU or GPU but the other one is so inferior that they arent getting any real performance out of that expensive product.  With my wifes PC i tried to put together a CPU thats good enough to go through 1 gpu upgrade in the future before I need to possibly replace it.  New GPU and CPU are coming out slow enough that either should last a couple years.   

       It doesnt seem that software developers either arent good enough to create code that runs efficiently on the hardware that we have today or they simply code for lowest common denominator to get the most money possible.  I am not saying coding for the high end should be priority, but in all honesty if software is released today that cant use 4+ cores when available, these are the people holding computers back.  Skyrim is a good example of shit software.  They port Skyrim to the PC and I believe it was limited to addressing 1 gig of ram, I mean seriously 1 gig.  Of course modders take over and fix the problem that shouldnt have existed in the first place.

       I am sure due to physics, the CPU in its current form is limited on speed due to heat or simply due to size restrictions.  How small can you possibly make an insulator between two conductors without some sort of breakdown.  At some point smaller is no longer going to be better.  It seems adding cores and actually using them efficiently will be the interim step to some technology leap, or at least some change in insulator. 

       Everytime I see a decent looking game on an X bone or ps4 I begin to wonder how they managed to leverage the pretty mediocre hardware available.  Then I peer at a mediocre PC with a 1090t, a radeon 6950,  and 8 gig of 1600 and I wonder why the fuck cant they leverage the hardware in that system. 

    Let's not be quick to praise games for requiring higher end hardware by being inefficient.  In the case of Skyrim, if you can do everything you want inside of 1 GB, what's the advantage to requiring more memory?  The Xbox 360 and PS3 each had 512 MB of memory--for system memory and video memory added together.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by zevian
    Originally posted by g0m0rrah

       Everytime I see a decent looking game on an X bone or ps4 I begin to wonder how they managed to leverage the pretty mediocre hardware available.  Then I peer at a mediocre PC with a 1090t, a radeon 6950,  and 8 gig of 1600 and I wonder why the fuck cant they leverage the hardware in that system. 

     

     

    They manage to put games that look good on the PS4 or XBON due to the fact that device is standardized and specialized to perform that task.   Thats all it does.

     

    A  PC does a number of different things all the time, the CPU isnt just running the game, its managing every other thing the computer has loaded into memory,   this is why you see games on XBONE running at less FPS than say the PS4, the XBONE is also running a smaller modified operating system controlling the other home media features built in, so less of the devices power is dedicated to performing its main task (gaming).    

     

    The PC  gets better results mainly through brute force, its doing all these other things but at the same time its playing your game but its doing it all with more resources availiable so it can perform better.     If we saw a modern day gaming system with cutting edge specifications dedicated to only 1 purpose (playing games)   you would see some amazing things out of that device.                Steam is trying to accomplish something like that with their STEAMBOX  you would be amazed what stadardized parts allow a programmer to do,   Thats why you see on game consoles games get better as the console is out longer, programmers learn better and more efficient ways to make those specific items work better together.    The mish mash of parts in gaming PC's really do hold PC only game development back, instead of programming for x cpu and x video card they need to program for xxxxx Chips and xxxxx video cards (amongst all the other components).

     

    TLDR  component standardization is HUUUUGE is programming it not only makes a developers job easier but allows them to extract more power out of older components because they learn the limits and capabilities of that specific set.

    Running other things at the same time isn't a problem unless something else you have running is really aggressive about claiming resources.  You can check Task Manager to see how much various programs are claiming, both in terms of CPU usage and memory.  You might have dozens of programs running at a time, but usually all of them added together add up to something like 1% of your CPU.  That's not a problem, and it usually isn't a problem for memory, either.

    The reason why Xbox One has to run at lower settings or get lower frame rates than PS4 is that the Xbox One has inferior hardware.  It has nothing to do with other stuff running at the same time.  8 GB of memory and an 8-core CPU are plenty to handle whatever else needs to run at the same time without making a meaningful dent in what is available to games.

    Component standardization does make some things easier.  For example, if you're making a game for PS4, it doesn't matter if the game is unplayable or even crashes outright on hardware 10% slower than a PS4.  It also doesn't matter if the game wouldn't run on a slightly older API than what the PS4 supports.

    But component standardization has its drawbacks, too, especially when whoever chooses the standard components botches the decision.  See, for example, the Xbox One or PS3.  Or just about any prebuilt desktop or laptop ever made, for that matter.

  • dave6660 Member Uncommon Posts: 2,699
    Originally posted by DamonVile

    This is kind of off topic but still about the future of desktops, so...

    As the internet connections get better and better don't you think the future of PCs is going to move away from physical boxes all together and putting everything into a cloud. A PC could one day be nothing more than a keyboard mouse and screen with an internet connection. Or do you think that's something we'll never actually see and that desktops will always out perform anything like that.

    Personally, I don't want to go back to the days of dumb terminals and big iron.

    “There are certain queer times and occasions in this strange mixed affair we call life when a man takes this whole universe for a vast practical joke, though the wit thereof he but dimly discerns, and more than suspects that the joke is at nobody's expense but his own.”
    -- Herman Melville
