Intel has now launched Bay Trail Atom (with Silvermont cores), and sent out their top tablet bin Atom Z3770 to reviewers for benchmarks. On the CPU side, a Silvermont Atom core at 2.4 GHz is roughly competitive with an AMD Jaguar core at 1.5 GHz. On the GPU side, the top bin Intel Atom Z3770 is at best roughly competitive with AMD's bottom bin A4-1200.
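That rough equivalence implies Jaguar does about 1.6x the work per clock, which is where the clock targets in the points below come from. A quick back-of-envelope sketch (assuming the scaling is linear, which it never exactly is):

```python
# Clock-equivalence arithmetic implied by the benchmarks: a 2.4 GHz
# Silvermont core roughly matches a 1.5 GHz Jaguar core, so Jaguar
# does about 2.4 / 1.5 = 1.6x the work per clock.
# Illustrative only; real performance never scales perfectly with clock.

JAGUAR_PER_CLOCK_ADVANTAGE = 2.4 / 1.5  # ~1.6x

def silvermont_clock_to_match(jaguar_ghz: float) -> float:
    """Silvermont clock (GHz) needed to roughly match a Jaguar core."""
    return jaguar_ghz * JAGUAR_PER_CLOCK_ADVANTAGE

# A 2 GHz Jaguar (nettop-class Kabini) would take a ~3.2 GHz Silvermont:
print(round(silvermont_clock_to_match(2.0), 2))  # 3.2
# A 1 GHz Jaguar (tablet-class Temash) would take ~1.6 GHz:
print(round(silvermont_clock_to_match(1.0), 2))  # 1.6
```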
That leaves the question of what market(s) Bay Trail Atom will make any sense for. Let's consider some, in order:
1) Nettops: Nope. In a nettop, you can (or at least will be able to by the time Atom shows up in commercial products) readily get four AMD Jaguar cores at 2 GHz. Intel would have to clock Atom above 3 GHz to hang with that on the CPU side. And the GPU side is just going to be brutal for Intel.
2) Laptops: Not really. If the AMD A4-5000's TDP of 15 W isn't too high for your taste, that will let you cheaply hang with Silvermont Atom's CPU performance and destroy its GPU performance. Intel being Intel, there's a good chance that Silvermont Atom will cost more, too. Oh, and Silvermont Atom is only 32-bit, which means you're capped at 4 GB of memory, and to get 4 GB, you have to use two memory channels, so multiple modules, which adds cost and power consumption.
If you want to push down around 8 W, then AMD won't clock Jaguar cores above 1 GHz there, and the Silvermont Atom's turbo availability could give it a substantial advantage in single-threaded CPU performance. So there, it's at least debatable. Why you'd be unable to handle a 15 W SoC in a laptop form factor is something of a mystery, however.
3) Windows tablets: Likely not. If you're looking at an AMD A6-1450 (8 W), then Atom might be a decent competitor, again because of its ability to turbo up a single core. But it's going to lose badly on graphics, and even that is quite a lot of power for a tablet. If you're looking at an AMD A4-1200 (3.9 W), then can Intel fit that TDP while clocking Silvermont Atom cores high enough (~1.6 GHz) to keep pace with 1 GHz Jaguar cores? Even if they can, they're going to get crushed on the GPU side.
4) Android tablets: Definitely not. Why would you want Silvermont Atom cores over ARM Cortex A15 cores, or for that matter Apple Swift (which would get you iOS, not Android) or Qualcomm Krait? Intel will charge much more while likely struggling to match ARM's power or performance numbers. And that's even before we consider the vast amounts of Android software built for ARM and virtually none for x86.
5) Cell phones: Of course not. It's not even clear that Silvermont Atom cores can scale performance down far enough to be suitable for phone use. And even if they can, the chances that they'll be more sensible than ARM are basically zilch.
6) Servers: Doubtful. Being 32-bit isn't so crippling in tablets or phones, but it's a big problem for an awful lot of servers. Even if you want cheap, low power server cores, AMD Jaguar cores are 64-bit and can handle far more memory. Keep in mind that the GPU accounts for much of the TDP in Kabini and Temash; if you're not using that, Jaguar cores by themselves are quite efficient. Not to mention being much faster than Silvermont Atom cores. Avoton Atom is coming soon (next year?), and will be 64-bit and have a better chance.
7) Embedded: Embedded needs can be all over the place, and this is a market I know less about than the others. But I don't see a big market for Silvermont Atom here, as you'd struggle to find a place where they make more sense than either AMD Jaguar or ARM Cortex A15 cores in Atom's intended power range. If you need more performance, then you look at Haswell or Piledriver cores or whatever. If you need lower power consumption, then ARM offers Cortex A7, A5, and a bunch of other options.
-----
A lot depends on power consumption. If the Atom Z3770 that Intel sent out for review has a TDP of 15 W, then it's dead on arrival. If it has a TDP of 4 W, then it could be an excellent chip. An AMD A4-5000 is much faster than an A4-1200, and if you're competing with the latter, you don't need nearly so much performance as for the former.
But here, Intel is conspicuously not stating the TDP of any parts just yet. That's not the sort of thing that a company confident in its hardware would do on launch day. AMD announced their TDPs up front with the launch of Kabini and Temash, and tipped their hand even before that with the launch of embedded versions of them. Intel itself announced TDPs at the launch of Haswell, Ivy Bridge, Sandy Bridge, and every other part going back many generations.
Ultimately, Silvermont Atom is a huge advance over previous generation Saltwell Atom cores. But Intel is facing much stronger competition now: AMD Jaguar cores rather than Bobcat, and ARM Cortex A15 cores rather than A9. Saltwell Atom was an awful product, and Silvermont is, at worst, considerably less bad. Silvermont at least got sent out for reviews; Intel marketing thought it was better to silently launch Saltwell and spare the poor chip the indignity of being blasted as worthless by every reviewer with a shred of integrity.
The only real reason to buy Saltwell Atom was not knowing any better. It's not yet clear whether Silvermont Atom will face the same fate.
Comments
wait...what?
I've never taken away any bias from his posts.
There is Quark: x86 CPU at 1/5th the size and 1/10th the power of Silvermont.
You could probably run a cheap phone with it, but it's definitely aimed at "wearables": watches and small embedded things (medical devices, google glass type stuff, etc).
Atom hasn't really had a niche since its introduction. Although I do see a lot of NAS stations using it, the market for NASes isn't terribly big in the first place.
^^
Yep this is Intel's newest direction. http://gizmodo.com/intel-announces-a-new-14nm-class-processor-for-a-weara-1285545222
Quark would have to beat out entrenched incumbent chips from ARM and MIPS that cost mere pennies per core. Intel being Intel, Quark is not going to be cheap.
And do I really believe that Intel's first use of their cutting edge, very expensive 14 nm process node will be for a dinky little chip that may or may not have a market and costs next to nothing? No, as a matter of fact, I don't. Quark might come to 14 nm eventually. But not in 2014, and not likely in 2015, either. Rumor puts the first Quark chip on 32 nm, where it would be at a process node disadvantage to today's competition on 28 nm.
I don't know that there really is an entrenched market in the "wearable and ultra-small" category. That category is more or less still emerging.
We are just now really seeing the first products hit the market, those will only be for early adopters, but if they catch on, it could blow up big.
Pebble, Samsung Galaxy Gear, Google Glass... discounting the old 1980's calculator/video game watches, these are the first wearable products with enough capability to really require a processor. The only device of this size/ability that has really shipped en masse has been the iPod Nano... and it isn't really a wearable (or is it?).
So, to say that Intel has to uproot the installed base is somewhat of a misnomer, there is no installed base yet. Intel is hoping to get in early, use their name recognition to allow them to sell their product at a premium (just like they do with x86 CPUs), and dominate the market from the start.
ARM may seem ideal, since they have been the low power kings for a really long time now, but this is a new category, and Intel may be able to shake it up if they play their cards right. I'm not betting on Intel - they totally botched Atom, and their relationship with Microsoft wasn't nearly as strong as it needed to be to prevent an ARM version of Windows (never mind that Windows 8 itself wasn't nearly as strong as it needed to be) or to drive people to Intel tablets/convertibles/subcompacts.
AMD may not be the speed kings in high performance desktops, but they have Intel bent over a barrel where the volume is in PCs, and ARM has totally dominated in the emerging "mobile" market.
Now Intel is betting the next emerging market is in wearables (or stuff even smaller and more ubiquitous than mobile). They are probably right - although it may not take off for another decade or so. Kinda like when Apple released the Newton, or Microsoft the Tablet PC - both were a bit premature for the technology, but we inevitably got there, and those categories are now indisputably driving the general purpose technology market.
analyst opinion
http://www.forbes.com/sites/davealtavilla/2013/09/11/intels-bay-trail-impresses-you-wont-have-atom-to-kick-around-anymore/
“Bay Trail is best suited for 10-inch form factors because users will be doing more content creation on them, but that doesn’t make it unsuitable for an 8 inch tablet.”
..
All told, Intel has teed up Bay Trail quite nicely and the Atom Z3000 series could make for excellent alternative tablet offerings this Q4 shopping season. Windows 8 slates are being queued up now for the new platform and are expected to hit next month, with Android offerings reported soon to follow. Intel’s new Bay Trail Atom family is the proverbial “real deal.” The remaining variables are what manufacturers are going to do with the new Atom and how those tablet offerings will compete feature-wise with the likes of Apple and the ARM/Android army. We’ll find out soon enough.
http://www.itworld.com/hardware/372617/new-wave-tablets-intel-bay-trail-chip-will-start-99
Android tablets with the chips will start at $99, said Intel CEO Brian Krzanich during an IDF keynote Tuesday. The first wave of tablets will have the Windows 8.1 OS, quad-core Atom Z3700 processors and start at around $350, with the less-expensive Android tablets appearing at the end of the year and running on either the quad-core chip or the dual-core Z3600 chips, which will only work with Android tablets.
The first article notes that Bay Trail Atom is vastly better than the previous Cedar Trail Atom, but that's not saying much. The author doesn't compare it to the obvious competition, and doesn't seem to even be aware that there is competition.
The second article is completely useless. It sounds like he read Intel marketing materials, didn't know what they meant, summarized them, and called it an article.
Whether Bay Trail Atom is any good on Windows systems depends greatly on how it compares to AMD's Kabini/Temash. Any article that doesn't try to make that comparison isn't a serious analysis unless it restricts itself to using Silvermont Atom in a non-Windows environment.
And there are some serious red flags. Intel's page on the chips is up now:
http://ark.intel.com/products/76760/Intel-Atom-Processor-Z3770-2M-Cache-up-to-2_39-GHz
The most conspicuous issue is that it makes no mention of TDP. There is an SDP, but that's about as meaningful as the nominal wattage rating on a power supply from a disreputable vendor. It's a marketing number, not an engineering one. This may be the first time Intel has put up a page for a CPU without listing the TDP. It's telling that Intel doesn't want to talk about TDP; AMD announced the TDP of their own products at launch. If Intel had a better chip than AMD, why not say so? The entire competition in low-wattage chips is how much performance you can pack into a given wattage; it's trivial to add performance if you're allowed more wattage.
Another concerning issue is the lack of mention of graphics API support. Intel claims DirectX 11 and OpenGL ES 3.0 for Windows. But the nearest DirectX equivalent to the latter is 9.0c; if it supported the full OpenGL, why not say so? Temash/Kabini support OpenGL 4.2, and will support 4.3 with the next video driver update. Intel doesn't say the feature level of its DirectX 11 support, either, so it could easily mean 9_3. Microsoft lets graphics chips that don't properly support anything past DirectX 9.0c report themselves as DirectX 11 feature level 9_3, and many vendors just call this DirectX 11.
On the bright side, Bay Trail comes in cheaper than anticipated, at $37 for the top bin. For comparison, the top bin Kabini/Temash chip is $72.
Even so, with those two glaring red flags, I'm going to say that Bay Trail Atom is junk until proven otherwise. When a company keeps some specs secret at launch, it's basically never because they don't want you to know that they're awesome.
It will sell Windows 8/8.1 Tablets better than Windows RT, which everyone hated.
There is that ....
seen on cnet, forthcoming 10" win 8.1 tablet/laptop hybrid
Asus Transformer Book T100 features Bay Trail CPU, coming October 18 for $349 (hands-on)
http://reviews.cnet.com/tablets/asus-transformer-book-t100/4505-3126_7-35827544.html
What I find strange is that, while Acer and HP sell a number of laptops with AMD Temash chips--including the best bin A4-1250 and A6-1450--they only put the chips in normal laptops and not in tablets or detachables. I wonder if Bay Trail Atom will meet the same fate.
If you want a Windows 8 tablet, both Temash and Bay Trail Atom are vastly better than anything you can actually buy in a tablet today (the competition: Cedar Trail Atom, AMD Hondo, Ivy Bridge, and assorted junk). But proper Windows 8 tablets seem to be slow in coming, even after there are a lot of Kabini/Temash laptops out there. Now rumors say that Microsoft is going to stick a Haswell chip in the Surface Pro 2, which, at 11.5 W or higher, would be another failure at bringing a decent Windows 8 tablet to market.
Intel won't talk about Bay Trail power consumption, so we don't know what it is. The AMD A4-1250 is 3.9 W TDP. The AMD A6-1450 is 8 W TDP. Intel claims an "SDP" of 2 W for the Atom Z3770, but that doesn't actually mean anything. AMD could claim an SDP of 1.5 W for either of its tablet chips if so inclined. Or 3 W. Or any other number between the idle power consumption and the TDP.
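To make that concrete: an SDP claim can't even be falsified, because any number between idle power and TDP is technically defensible. A minimal sketch (the 3.9 W TDP is the quoted A4-1250 figure; the 0.5 W idle is a hypothetical placeholder):

```python
# "SDP" is just some number between idle power and TDP, chosen by marketing.
# Any claimed SDP inside that range is technically defensible, which is
# exactly why it carries no engineering information.

def sdp_is_defensible(claimed_sdp: float, idle_w: float, tdp_w: float) -> bool:
    """An SDP claim can't be falsified if it sits anywhere in [idle, TDP]."""
    return idle_w <= claimed_sdp <= tdp_w

# AMD A4-1250: 3.9 W TDP. Assume ~0.5 W idle (hypothetical figure).
for claim in (1.5, 2.0, 3.0):
    print(claim, sdp_is_defensible(claim, idle_w=0.5, tdp_w=3.9))
# All three are "defensible": AMD could claim any of them, just as Intel does.
```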
It wins at having the Intel marketing machine behind it. A lot of people think having that "Intel Inside" sticker means more than it really does.
pcper can't test the power draw of just the CPU. They also can't come up with a theoretical max thermal output case - they can just run some benchmarks and see whatever a Kill-A-Watt pulls out of the wall.
That's a lot different than an actual engineering-based TDP, which is geared more toward thermal removal considerations than power input considerations.
The two numbers should correlate to some degree, but that doesn't mean they are equal, and it doesn't mean that pcper found the maximum power draw case with their benchmarks either.
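The best a wall-socket measurement can do is a crude estimate, and even that leans on assumptions. A sketch, with every figure hypothetical:

```python
# Why a Kill-A-Watt reading isn't a TDP: it measures the whole system at the
# wall, through an imperfect PSU, under whatever load the benchmark happened
# to produce. A rough back-of-envelope estimate (all figures hypothetical):

def estimated_soc_power(wall_load_w: float, wall_idle_w: float,
                        psu_efficiency: float = 0.85) -> float:
    """Crude SoC power estimate: (load - idle) at the wall, scaled by an
    assumed PSU efficiency. Ignores VRM losses, RAM/storage activity, and
    the fact that the benchmark may not be the worst-case thermal load."""
    return (wall_load_w - wall_idle_w) * psu_efficiency

# E.g. 24 W under load vs 14 W idle through an 85%-efficient PSU:
print(round(estimated_soc_power(24.0, 14.0), 1))  # 8.5
```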
another hands on
http://www.notebookreview.com/default.asp?newsID=6951&News=ASUS+Transformer+T100+Windows+8+Tablet+Hands+On+Preview
Full load under what test? Did they push both the CPU and the GPU hard? On nearly any AMD APU, even an artificial stress test of the CPU that doesn't touch the GPU probably won't draw half of the rated TDP.
Furthermore, you have to report both power draw and performance numbers for a chip and its competition in exactly the same test in order for the results to be meaningful. Otherwise, a chip can clock up for extra performance in the performance test and throttle back to save power in the power test.
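The arithmetic behind that objection is simple, which is exactly why mismatched tests are so misleading. A sketch with hypothetical numbers:

```python
# Performance-per-watt only means something when the score and the wattage
# come from the same run. All numbers below are hypothetical.

def perf_per_watt(score: float, watts: float) -> float:
    """Efficiency metric: benchmark score per watt consumed during that run."""
    return score / watts

# Honest comparison: one run measures both score and power.
same_run = perf_per_watt(score=1000, watts=8.0)       # 125 points/W

# Misleading comparison: score taken from a full-turbo run, power taken from
# a separate throttled-back run. The chip looks twice as efficient as it is.
cherry_picked = perf_per_watt(score=1000, watts=4.0)  # 250 points/W

print(same_run, cherry_picked)
```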