
Intel launches higher-wattage Broadwell chips--and they're still slower than Haswell

Quizzical Member Legendary Posts: 25,531

http://www.tomshardware.com/news/intel-broadwell-u-realsense,28321.html

So now we know how fast Broadwell can go in a laptop:  3.4 GHz.  And that's in a 28 W TDP.  For comparison, Haswell dual cores had turbo as high as 3.7 GHz:

http://ark.intel.com/products/80345/

Haswell quad cores went up to 4 GHz--and yes, that's in a laptop:

http://ark.intel.com/products/83503/Intel-Core-i7-4980HQ-Processor-6M-Cache-up-to-4_00-GHz

Broadwell will probably have a little better IPC than Haswell, but not enough to make up for the difference between 3.4 GHz and 3.7 GHz.  So will desktop Broadwell even matter?  Not likely.
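
To put rough numbers on that, here's a back-of-the-envelope sketch (the clock speeds are from the links above; the "typical" generational IPC gain in the comment is my own assumption, not a measured figure):

```python
# Back-of-the-envelope: IPC uplift Broadwell would need just to match
# Haswell's top mobile turbo clock in single-threaded work.
haswell_turbo_ghz = 3.7    # Haswell dual-core mobile turbo (see ark link)
broadwell_turbo_ghz = 3.4  # top Broadwell-U turbo from the article

required_ipc_gain = haswell_turbo_ghz / broadwell_turbo_ghz - 1
print(f"Broadwell needs ~{required_ipc_gain:.1%} more IPC just to break even")
# ~8.8% -- more than the roughly 5% per-generation IPC gains Intel has
# typically delivered lately (an assumption, not a measured figure).
```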

Comments

  • tawess Member Epic Posts: 4,227

    Ok, being a processor scrublet, I have to ask...

     

    Is speed everything?

     

    Or can it have other redeeming qualities?

    This has been a good conversation

  • Tamanous Member Rare Posts: 3,030
    Battery life. Intel doesn't really give a flying shit about the <1% of the market that buys laptops for gaming.

    You stay sassy!

  • Zarf42 Member Posts: 250
    I don't think Broadwell is coming to desktops. That will be the next 14 nm microarchitecture after Broadwell.

     

  • Quizzical Member Legendary Posts: 25,531

    What do you care about in a processor?  Other than that it actually works (which pretty much everything does), you want higher performance, lower price, and lower power consumption.  And in a desktop, you might not care much about the lower power consumption.

    So if you could buy Haswell or Broadwell, and the latter is both more expensive and slower, which do you pick?  Broadwell might be a nice chip for Windows 8 tablets, as it does get load power consumption low enough to be reasonable in a tablet--something that Haswell notably failed to do.

    But in a laptop that you want to play games on, it's a completely stupid chip.  If you want to play games, you want graphics that work.  AMD will sell you integrated graphics that work, while Intel graphics will probably kinda sometimes work.  Maybe.  If you're lucky.  You can also get a discrete card, except that Broadwell doesn't have enough PCI Express lanes to allow the CPU to properly communicate with that card.

    Maybe that will change in a desktop version or maybe it won't.  But if you can't get the proper performance from a discrete video card, you don't want Broadwell in a desktop.  And even if you can, if it's not faster than Haswell, you don't want it unless it's cheaper.  Which it won't be, as Intel is charging a fortune for Broadwell.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by cura
    Originally posted by Zarf42
    I don't think Broadwell is coming to desktops. That will be the next 14 nm microarchitecture after Broadwell.

     

    Yep, that's the thing I'm waiting for

    What makes you so certain that Skylake will be able to clock high enough to be competitive with Haswell?  Both AMD and Intel have been seeing reduced maximum clock speeds with recent die shrinks, even though they're using totally independent process nodes.  Broadwell only continues that trend.

    Now, maybe Skylake will be able to beat Haswell in a desktop.  Maybe it will even be able to beat Haswell by enough to actually matter.  But it's hardly automatic.

  • Pepeq Member Uncommon Posts: 1,977

    This is like debating the importance of going from zero to sixty in the shortest amount of time when the traffic you are in won't even exceed 35 mph.

     

    Most every technological invention related to computers has the potential to do X but in reality will never be taxed beyond Y.  Sure, you can upgrade the nanosecond they put something new out, but in the end, you waste more in potential than you will ever actually use.  They are in the business of selling you something new... that doesn't necessarily mean it's something you absolutely need.  It's called planned obsolescence... while they'd prefer you consider your hardware obsolete every week, they usually get you to bite every 2 to 5 years.  Those who hold out for 10 years are the only ones who truly maximize their investment.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Pepeq

    This is like debating the importance of going from zero to sixty in the shortest amount of time when the traffic you are in won't even exceed 35 mph.

     

    Most every technological invention related to computers has the potential to do X but in reality will never be taxed beyond Y.  Sure, you can upgrade the nanosecond they put something new out, but in the end, you waste more in potential than you will ever actually use.  They are in the business of selling you something new... that doesn't necessarily mean it's something you absolutely need.  It's called planned obsolescence... while they'd prefer you consider your hardware obsolete every week, they usually get you to bite every 2 to 5 years.  Those who hold out for 10 years are the only ones who truly maximize their investment.

    If you think the performance difference doesn't matter, then why would you buy a $400 chip when a $50 alternative is just as good?  Broadwell doesn't exactly have budget-friendly price tags.

  • Ridelynn Member Epic Posts: 7,383

    In a laptop or other mobile device, I would make a strong argument that battery life is as important as, if not more important than, processor performance.

    Battery life is not just the TDP of the chip; it has to take into account the entire energy management scheme. TDP is just the maximum draw; it doesn't really say anything about the how/when/where/what of the idle states and other power management features of the die. One chip could have a much higher TDP but lower overall energy use because of better energy management.
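
    As a toy model of that point (all of the wattages, duty cycle, and battery size below are hypothetical, purely for illustration):

    ```python
    # Toy battery-life model: average draw, not TDP, drains the battery.
    # Wattages, duty cycle, and battery size are made-up example numbers.
    BATTERY_WH = 40.0  # hypothetical ultrabook battery capacity

    def battery_hours(idle_w, load_w, load_fraction):
        """Hours of battery given idle/load draw and fraction of time under load."""
        avg_w = load_w * load_fraction + idle_w * (1 - load_fraction)
        return BATTERY_WH / avg_w

    # Chip A: higher TDP, but aggressive idle power management.
    print(battery_hours(idle_w=0.5, load_w=15.0, load_fraction=0.1))  # ~20.5 h
    # Chip B: lower TDP, but poor idle power management.
    print(battery_hours(idle_w=2.0, load_w=10.0, load_fraction=0.1))  # ~14.3 h
    ```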

    Is that worth the premium that Intel is charging for Broadwell? I don't know - I haven't seen any real world performance numbers.

    But the power-saving changes in Haswell's mobile dies were a significant step beyond Ivy and Sandy. Even though the chip wasn't necessarily any faster, the newer power management features were a huge benefit for mobile users, especially laptop users.

    The other bit of hoopla about Broadwell is integrated graphics. Intel sucks at graphics; that's more or less a given. Haswell graphics sucked less than traditional Intel graphics, but less suck is still suck compared to discrete GPUs. Broadwell promises to improve on that greatly - I doubt it will come up to the level of AMD APU graphics, but on the scale of good to bad, less suck is generally preferable to more suck. This doesn't apply to gamers and pros who will use a discrete GPU regardless, but the vast majority of laptops and other mobile devices don't need discrete GPUs. And if you argue for faster graphics in ARM-based ultra-mobile devices (phones/tablets/etc.) on the basis that software can't leverage compute power that isn't there, you can't turn around and argue against it in Intel products, especially ones aimed at overlapping markets.

    Desktop Intel architecture hasn't evolved much since Sandy, really - or maybe you could even go back another generation to Nehalem. I had some hope for Haswell clocking better, but that didn't prove to be the case, and now I agree that Broadwell won't mean much for the desktop either.

    But I think Haswell was huge for laptops and AIOs - maybe not so much in other mobile devices, but maybe Broadwell gets Intel closer there. The Surface Pro 3 was a big hit in no small part because of Haswell, and if Broadwell can take what made Haswell good and expand on that - great. Haswell wasn't good because it was faster; it was relevant because it had better energy management.

  • Quizzical Member Legendary Posts: 25,531

    Absolutely, I agree that Haswell was a huge deal in laptops.  But that's much of why Broadwell isn't:  once you've got an SoC whose idle power consumption is a small fraction of a watt, where do you go from there to improve on that?  Idle power consumption can't go below zero, and nearly all of the improvement that was even theoretically possible a few years ago has since been had.

    As for graphics, Intel claims that Broadwell is "up to 22%" faster than Haswell in 3D graphics.  No one who wasn't happy with Haswell's graphics said that if only it were 22% better, it would be good enough.

    But while idle power consumption drives battery life, TDP still matters because it drives cooling needs.  The reason you don't put a 47 W chip in a tablet isn't because you're worried about battery life.  And there, the difference between 15 W Haswell and 4.5 W Broadwell matters a lot.  If you try to put a 15 W chip into a tablet, you need to make the tablet obnoxiously thick and heavy, come up with some very exotic and very expensive cooling system, or cross your fingers and hope that no one ever dares to play a game on it.  And I'm not a fan of that last option.
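
    A crude way to see why TDP drives cooling rather than battery life (the thermal resistance here is a made-up figure for a thin fanless chassis, just to show the shape of the problem):

    ```python
    # Steady-state temperature rise is roughly power x thermal resistance.
    # The 8 C/W value is a guess at a thin, fanless tablet chassis.
    AMBIENT_C = 25.0
    R_THETA_C_PER_W = 8.0  # hypothetical chassis-to-air thermal resistance

    for tdp_w in (4.5, 15.0):
        temp_c = AMBIENT_C + tdp_w * R_THETA_C_PER_W
        print(f"{tdp_w:>4} W sustained -> ~{temp_c:.0f} C")
    # 4.5 W -> ~61 C: manageable.  15 W -> ~145 C: not without a thicker
    # chassis, a fan, or heavy throttling -- exactly the trade-off above.
    ```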

    So yes, Broadwell will matter in the Surface Pro 3, the next MacBook Air, and some stupidly priced Ultrabooks--at least to the degree that they matter at all.  But in more budget-friendly laptops, or for gaming--whether laptop or desktop?  Nope.

  • Quizzical Member Legendary Posts: 25,531

    And what if that advertised 4.5 W TDP is a lie?

    http://www.notebookcheck.net/Lenovo-Yoga-3-Pro-Convertible-Review.129882.0.html

    Look at the Cinebench results.  It gives max turbo at a cost of using about 12 W for about 15 seconds.  Then it hits 75 C and throttles back to 1.8 GHz and the power consumption drops.  But the power consumption only drops to about 6 W, and never gets down to 5 W.

    Now Cinebench does push a CPU hard.  But that's with the GPU completely idle.  A stress test that pushed both left the CPU at 600 MHz and the GPU at 300 MHz.  For comparison, Intel promises a base CPU clock speed of 1.1 GHz, yet under that load it's running at 600 MHz.

    Yes, that's a stress test.  And it's surely lower power than Haswell.  But could they really slash power consumption from 15 W to 4.5 W from just a die shrink?  Nope.
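
    The burst-then-settle pattern in that review looks like a two-level power limit. A simplified model of that behavior (not Intel's actual algorithm; the limits and time constant below are guesses fitted to the review's numbers) would act like this:

    ```python
    # Simplified two-level power limit, loosely fitted to the Yoga 3 Pro
    # numbers in the review: burst high briefly, then settle to a
    # sustained limit.  These values are guesses, not Intel's parameters.
    PL_SUSTAINED_W = 6.0   # the ~6 W floor seen in Cinebench
    PL_BURST_W = 12.0      # the ~12 W turbo spike
    BURST_SECONDS = 15.0   # how long the burst budget lasted

    def allowed_power_w(t_seconds):
        """Power the chip may draw t seconds into a sustained full load."""
        return PL_BURST_W if t_seconds < BURST_SECONDS else PL_SUSTAINED_W

    for t in (0, 10, 20, 60):
        print(f"t={t:>2}s: ~{allowed_power_w(t):.0f} W allowed")
    ```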

  • Hrimnir Member Rare Posts: 2,415

    Well, Broadwell was really all about power consumption reduction etc., so I'm not particularly surprised.

    It's kind of like trying to get 600 hp out of an econobox inline-4 engine: yeah, it can be done, but it's much easier, and generally better, to just use a larger-displacement engine.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Quizzical
    And what if that advertised 4.5 W TDP is a lie?

    Look at the Cinebench results.  It gives max turbo at a cost of using about 12 W for about 15 seconds.  Then it hits 75 C and throttles back to 1.8 GHz and the power consumption drops.  But the power consumption only drops to about 6 W, and never gets down to 5 W.

    ...

    Yes, that's a stress test.  And it's surely lower power than Haswell.  But could they really slash power consumption from 15 W to 4.5 W from just a die shrink?  Nope.


    Playing Devil's Advocate here... isn't that what you'd want a chip to do? If it can safely exceed the TDP - and it can verify that it's safe in doing so (i.e., it knows its own temperature and power draw) - then what's the problem in exceeding the advertised TDP?

    If your cooling solution works fine at 4.5W but is entirely inadequate at 6W (it would be difficult to do that, I admit, but I'm just theorizing) - then the chip should throttle on down to eventually hit something around 4.5W based on maintaining a limit temperature.

    If power draw (battery consumption) is an issue, you can always have the software force the CPU into a higher P-state, which provides a temporary, easy-to-configure, software-selectable cap and throttles it down further to reduce power draw (conserve battery life). This is what Windows does in Energy Saving mode, and why you can see a particularly pronounced difference in performance between running on AC power and on just the battery.
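
    In sketch form, the kind of control loop being described might look like this (purely illustrative; the P-states and temperatures are invented, and real firmware is far more involved):

    ```python
    # Minimal sketch of temperature-limit throttling: step down one
    # P-state when the die is over its trip point, step back up once it
    # has cooled off.  All states and temperatures here are invented.
    P_STATES_GHZ = [2.6, 1.8, 1.1, 0.6]  # highest to lowest clock

    def next_pstate(idx, die_temp_c, trip_c=75.0, recover_margin_c=10.0):
        """Pick the P-state index for the next control interval."""
        if die_temp_c >= trip_c and idx < len(P_STATES_GHZ) - 1:
            return idx + 1   # too hot: drop a clock bin
        if die_temp_c < trip_c - recover_margin_c and idx > 0:
            return idx - 1   # comfortably cool: climb back up
        return idx           # hold steady near the limit

    idx = 0
    for temp in (60, 78, 80, 76, 70, 62):
        idx = next_pstate(idx, temp)
        print(f"die at {temp} C -> run at {P_STATES_GHZ[idx]} GHz")
    ```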

    I don't see a problem with a CPU exceeding its stated TDP so long as it has adequate monitoring and protection in place when it does so. After all, most of us enthusiasts have been exceeding TDP intentionally for years now.

    And - how exactly are they measuring just the CPU's power draw? Does their measurement actually have the accuracy to tell the difference between 6 W and 4.5 W? Or are they just guessing based on total system load? I tried to find this in the article, and it was never clear to me, but I just skimmed and didn't read it in detail.

    Now back to playing Consumer's Advocate:

    They are more or less guaranteeing a set speed at a set TDP. It would be an issue if the CPU couldn't deliver its rated clock speed at or under rated TDP - the Cinebench benchmark seems to indicate that. But TDP (if nVidia has taught us anything with Fermi) is a somewhat liquid definition, and 100% load on a chip can vary based on exactly what part of the chip is driving it to 100% (i.e., a full FPU load may run a different thermal profile than a full integer load, because they use different sections of the chip and would pipeline differently, even though both register as 100% CPU load).

    And does Intel consider Turbo to be part of that TDP, or does it only qualify base clocks at the rated TDP, with Turbo free to go above and beyond so long as it stays within a thermal/power envelope? (I honestly don't know the answer to that question.)
