Intel has been trying to get into mobile for several years now. The problem was never performance (at the time), because Atom was much more powerful than the 400-600 MHz ARM11 CPUs we had in smartphones back then. The problem has always been power consumption (cost, too, but that's another issue).
Intel is now trying to convince us that it has finally cracked the code, and that its chips are not just good enough for mobile, but actually superior. I think that's nowhere close to being true. Intel may have cracked the power consumption code for Atom, but only because it kept Atom's performance pretty much the same over the years, freeing it to work on efficiency instead. In general, increasing performance conflicts with increasing efficiency.
When a chip maker builds a chip, it has choices to make. Does it keep power consumption the same as last year's chip, but increase performance as much as it can? Does it keep performance the same, but lower power consumption as much as possible? Does it lower performance, but lower power consumption even more? Does it increase power consumption, but increase performance even more? Or does it compromise, increasing performance a little while also reducing power consumption a bit?
These can all be design goals for a chip, and the chip maker has to pick one depending on the market it wants to address. Since Intel wanted to address the mobile market, and since Atom already had a huge performance advantage, it decided to focus solely on power consumption. The problem is that during all this time, ARM chips have made incredible gains in performance, and now the latest Cortex-A15 or Qualcomm's latest custom ARM cores can easily beat Atom chips in performance, which is why ARM chips are still way ahead right now.
Another problem has been Intel's choice of weak GPUs. It always seems to use old and underpowered PowerVR GPUs, which keeps it well behind the competition. Its latest Clover Trail can't even beat Tegra 3 in GPU performance, and whatever chip it has planned next (for shipping at the end of the year) most likely won't be enough to compete even with Tegra 4.
Intel's video showing its GPU running the Unreal Engine and the Epic Citadel demo is pretty pointless, considering that demo is obsolete by now, and virtually any two-year-old device can run it at 30 FPS on max details at its native resolution. So while Intel is, I suppose, trying to impress with this video, it has no reason to impress anyone: