Friday, December 12, 2014

Apple A8 vs Snapdragon 805 vs Exynos 5433 - Smartphone SoC Comparison (2014 Edition)

Smartphones have become just about the most important gadget in a person's life, having positioned themselves as do-everything devices (recently, they even measure your heart rate). As such, a smartphone needs a powerful processor to keep things running smoothly with so many utilities and features baked in. Accordingly, smartphone processor performance has grown exponentially over the last few years, blazing past even some older laptops at this point. In 2014 we have the latest and greatest ultra-mobile processors shipping in devices in time for the holiday season. Among the best competitors we have Apple, with its A8 processor, found in the iPhone 6 and 6 Plus; Qualcomm, with its latest and greatest Snapdragon 805; and Samsung, with its octa-core Exynos 5433 SoC. There's no doubt that all three processors are performance monsters, but which of them offers the best performance, and more importantly, which one is the most power efficient?

Firstly, let's see how these processors compare on paper:

                  | Apple A8                                      | Snapdragon 805                              | Exynos 5433
 Process Node     | 20nm                                          | 28nm HPM                                    | 20nm HKMG
 CPU              | Dual-core 64-bit "Enhanced Cyclone" @ 1.4GHz  | Quad-core 32-bit Krait 450 @ 2.7GHz         | Octa-core 64-bit big.LITTLE (quad-core Cortex-A57 @ 1.9GHz + quad-core Cortex-A53 @ 1.3GHz)
 GPU              | PowerVR GX6450 @ 450MHz (115.2 GFLOPS)        | Adreno 420 @ 600MHz (172.8 GFLOPS)          | Mali-T760 MP6 @ 700MHz (204 GFLOPS)
 Memory Interface | Single-channel 64-bit LPDDR3-1600 (12.8GB/s)  | Dual-channel 64-bit LPDDR3-1600 (25.6GB/s)  | Dual-channel 32-bit LPDDR3-1650 (13.2GB/s)
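The theoretical figures in the table follow from simple arithmetic. Here's a minimal sketch in Python, assuming each GPU ALU performs one fused multiply-add (2 FLOPs) per cycle and that peak bandwidth is just bus width times effective transfer rate; the ALU counts are the ones implied by the table's own numbers:

```python
# Back-of-the-envelope calculators for the table's theoretical peak figures.
# Assumption: one FMA (2 FLOPs) per ALU per cycle; real GPUs rarely sustain this.

def gpu_gflops(alus, clock_mhz, flops_per_cycle=2):
    """Peak GFLOPS = ALUs x FLOPs-per-cycle x clock in GHz."""
    return alus * flops_per_cycle * clock_mhz / 1000

def mem_bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    """Peak GB/s = bus width in bytes x mega-transfers per second / 1000."""
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000

print(gpu_gflops(128, 450))           # PowerVR GX6450: ~115.2 GFLOPS
print(mem_bandwidth_gb_s(64, 1600))   # Apple A8: 12.8 GB/s
print(mem_bandwidth_gb_s(128, 1600))  # Snapdragon 805: 25.6 GB/s
```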


At least on paper, all three processors are extremely powerful and very competitive in both performance and efficiency. However, the three SoCs take very different approaches to get there. Apple prefers a small CPU core count, making each core very large so that just two of them deliver high performance. Samsung's quantity-over-quality philosophy means it chose to throw in a very large number of CPU cores (eight of them, in fact). Qualcomm sits between Apple and Samsung, offering four CPU cores with decent per-core performance. Given that most applications do not scale performance well beyond two cores, I personally prefer Apple's approach; still, the more limited selection of apps that can actually utilize a large core count, for instance games with complex physics calculations, might favor Samsung's approach. Either way, the most practical way of comparing these processors is with synthetic benchmarks.
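To put the core-scaling point above in numbers, here's a rough Amdahl's-law sketch; the 60% parallel fraction is an arbitrary illustration, not a measured figure:

```python
# Amdahl's law: speedup from n cores when only a fraction p of the work parallelizes.
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

# With 60% of the workload parallelizable, quadrupling the core count
# from two to eight yields well under a 2x additional gain:
print(speedup(0.6, 2))  # ~1.43x
print(speedup(0.6, 8))  # ~2.11x
```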

Let's start with the Geekbench 3 benchmark, which tests CPU performance:
As you can see, Apple's second-generation Cyclone core is just about the fastest core used in any current smartphone. Nvidia's Denver core in the Tegra K1 outperforms Cyclone, but since the Tegra K1 is pretty much a tablet-only platform, I'm not considering it in this comparison. Meanwhile, the Exynos 5433 (also 64-bit), while behind the A8 by a large margin, sits slightly above the Snapdragon 805. I also included data from the Snapdragon 801 chipset to quantify the evolution from its Krait 400 core to the Krait 450 in the Snapdragon 805. The difference isn't big, actually, which means the Snapdragon 805 has the weakest single-threaded performance of all current high-end SoCs.
With four high-performance CPU cores aided by another four low-power cores (yes, Samsung managed to make both core clusters work at the same time, unlike with their previous big.LITTLE CPUs), it was obvious from the start that Samsung's processor would come out on top in applications that scale to multiple cores. Indeed, the Exynos 5433 holds a significant multi-threaded advantage over the competition. In second place comes the Snapdragon 805, with a much lower yet still very high score. Again, the multi-threaded test shows only a marginal improvement over the Snapdragon 801. In last place comes Apple's dual-core A8, which, despite employing a very powerful core design, simply has too few cores to outperform the competition. Still, it's not far behind the Snapdragon 805, and its score is very respectable indeed.

Now, moving on to what probably is considered the most important area in SoC performance: graphics. To measure these processors' capability for graphics rendering, we turn to the GFXBench 3.0 test.
It's reasonable to say that the three main competitors in the high-end SoC segment are pretty much on par in terms of their GPUs' OpenGL ES 3.0 performance. However, the PowerVR GX6450 in the iPhone 6 Plus takes the lead, followed closely by the Snapdragon 805's Adreno 420, and in last place is the Mali-T760 in the Exynos 5433, but again, losing by a small margin.
OpenGL ES 2.0 sees the gap widen, but the same basic trend holds: the Apple A8 takes first place, followed closely by the Snapdragon 805, with the Exynos 5433 a bit further behind. Also note how, unlike in the CPU benchmarks, the Snapdragon 805 gets a huge boost here compared to its predecessor, the Snapdragon 801.
The ALU test measures the GPU's raw compute power, and on this front Qualcomm seems to be sitting very comfortably, since both the Snapdragon 805 and its predecessor the 801 are far ahead of the Apple A8 and Exynos 5433 GPUs.

The Fill test depends mostly on the GPU's Render Output Units (ROPs) and on the SoC's memory interface. Given that the Snapdragon 805 has a massive memory interface, comparable to the one in Apple's tablet-oriented A8X chip, it naturally has a huge advantage in this test. Meanwhile, the Apple A8 sits slightly below the last-gen Snapdragon 801, and the Exynos 5433 comes in last place, but by a small margin.

Power Consumption and Thermal Efficiency

Since these chips are supposed to run inside smartphones, the SoC has to fulfill two requirements: consume as little power as possible, especially at idle, and not heat up too much under strain. I believe Apple's A8 fares best in this department: apart from being built on a 20nm process, its Cyclone CPU has proved quite efficient in previous appearances. As for Samsung's Exynos 5433, despite also being built on 20nm, I'm not sure a processor that can have eight CPU cores running simultaneously can stay cool under strain without thermal throttling. At least in terms of power consumption, idle power should be very low thanks to the low-power Cortex-A53 cores. Finally, it's a bit hard to determine how power efficient Qualcomm's processors are, because the company discloses close to nothing about its CPU and GPU architectures. However, it is a proven solution: Krait + Adreno SoCs from Qualcomm can be found in almost every flagship smartphone from 2014, so while Qualcomm has the disadvantage of not yet having moved to 20nm, past experience suggests its SoCs and architectures are sufficiently efficient.

Conclusion

It's a bit hard to determine exactly which processor is the best. Each one of these fares better than the others in at least one area, but each also has its clear weakness.

The Apple A8, with just two (albeit very powerful) CPU cores running at relatively low clock speeds, delivers top-notch single-threaded performance; however, its low core count hurts it against the quad- and octa-core competition in multi-threaded applications. The PowerVR GX6450 GPU was a good choice, as at least for general gaming it appears to be the fastest solution available in any smartphone. Power consumption should also be pretty low, thanks to the 20nm process and to Apple's and ImgTec's efficient architectures.

The Snapdragon 805 is really more of an evolution of the 801, without any huge changes; notably, it's the only 32-bit processor compared here. It still manages to deliver excellent performance, building on the success of the outgoing 801. While its single-threaded performance is a bit disappointing for a 2.7GHz CPU, it does very well in multi-threaded applications, nearing the Exynos 5433's performance. The Adreno 420 GPU also performs extremely well, losing only to the Apple A8 in GFXBench's general gaming tests and absolutely destroying the competition in memory bandwidth and raw compute power. While a move to 20nm would be appreciated, Qualcomm's processors are known for being power efficient, so no problem here.

Finally, Samsung's Exynos 5433 is really a mixed bag. Its 20nm HKMG process, together with the low-power Cortex-A53 cores, paves the way for excellent power efficiency, at least in terms of idle power, and thanks to its huge core count, its multi-threaded performance is ahead of everyone else's. It should be noted that, despite the 20nm process, having eight cores running at full load might introduce thermal throttling, especially in a smartphone chassis.
However, the Mali-T760 GPU is slightly behind the competition in general gaming performance, and its raw compute power is quite disappointing... thankfully, raw compute power matters little to the vast majority of users. Still, it's an excellent GPU, just not THE best.

Overall, these are all excellent processors, each with its respective advantages and disadvantages. It all comes down to which aspects matter most to you. If you value performance in multi-threaded applications, an Exynos 5433-powered device is ideal. For an excellent all-around package that is also a proven solution for smartphones (plus admirable GPU compute power), pick a Snapdragon 805 device. And if you care less about multi-threaded performance but want the best gaming performance of any smartphone, pick one of Apple's A8-powered iDevices.

Sunday, November 23, 2014

Apple A8X vs Tegra K1 vs Snapdragon 805 - Tablet SoC Comparison (2014 Edition)

In the last few years, ultra-mobile System-on-Chip processors have made unprecedented strides in performance and efficiency, quickly raising the bar for mobile performance. One form factor that particularly benefits from this exponential growth is the tablet, since its large screen allows the processor's abilities to be fully utilized. For the 2014 holiday season, we have the latest and greatest of mobile performance shipping inside high-end tablets. Apple has made a whole new SoC just for its iPad Air 2 tablet, which it calls the A8X. Nvidia's Tegra K1 processor, which borrows Nvidia's desktop-class Kepler GPU architecture, has also appeared in a number of new high-end tablets. Finally, we have the Qualcomm Snapdragon 805 found in the Amazon Kindle Fire HDX 8.9" (2014). Unfortunately, most other tablets either use the aging Snapdragon 801 or, in the case of Samsung's latest high-end tablets, an even older Snapdragon 800 or the equally old Exynos 5420, which debuted with the Note 3 phablet in late 2013. In any case, at the pinnacle of tablet performance, we have the Apple A8X, the Tegra K1 and the Snapdragon 805 battling for the top spot.

                  | Apple A8X                                    | Nvidia Tegra K1                                                           | Snapdragon 805
 Process Node     | 20nm                                         | 28nm HPM                                                                  | 28nm HPM
 CPU              | Tri-core 64-bit "Enhanced Cyclone" @ 1.5GHz  | 32-bit: Quad-core Cortex-A15 @ 2.3GHz; 64-bit: Dual-core Denver @ 2.5GHz  | Quad-core 32-bit Krait 450 @ 2.5GHz
 GPU              | PowerVR GXA6850 @ 450MHz (230 GFLOPS)        | 192-core Kepler @ 852MHz (327 GFLOPS)                                     | Adreno 420 @ 600MHz (172.8 GFLOPS)
 Memory Interface | Dual-channel 64-bit LPDDR3-1600 (25.6GB/s)   | Dual-channel 64-bit LPDDR3-1066 (17GB/s)                                  | Dual-channel 64-bit LPDDR3-1600 (25.6GB/s)


The CPU

It can certainly be said that all of this year's high-end mobile processors have excellent CPU performance. However, each manufacturer took a different path to meet those performance demands, and that is what we'll look at in this section.

Starting with the A8X's CPU: what we have here is Apple's first CPU with more than two cores. This time it's a tri-core design, based on an updated revision of the Apple-designed Cyclone core, which uses the ARMv8 ISA and is therefore 64-bit. Clock speeds remain conservative, going no further than 1.5GHz. So with three cores at 1.5GHz, how does Apple get performance competitive with quad-core, 2GHz+ offerings from competitors? The answer lies within the Cyclone core.
The Cyclone CPU, now in its second generation, is a very wide core: it can issue up to 6 instructions per clock. Each Cyclone core contains 4 ALUs, as opposed to 2 ALUs per core in Apple's previous CPU architecture, Swift, and the reorder buffer has been enlarged to 192 instructions in order to avoid memory stalls and keep the 6 execution pipelines fed. In comparison, a Cortex-A15 core can issue up to 3 instructions per clock, half as many as Cyclone, and can hold up to 128 instructions in its reorder buffer, only two thirds of what Cyclone's can hold.
By building a very wide CPU architecture and keeping core counts and clock speeds low, Apple has, in one move, achieved excellent single-threaded performance, far beyond what a Cortex-A15 or a Krait core can produce, while at least matching the quad-core competition in multi-threaded processing. I've always said that, because most workloads remain largely single-threaded, CPUs are more efficient when built with an emphasis on single-threaded performance, and Apple continues to do the right thing with Cyclone.

The Snapdragon 805 is the last high-end SoC to utilize Qualcomm's own Krait CPU architecture, which was introduced WAY back with the Snapdragon S4. Needless to say, it's still a 32-bit core. The latest revision of the architecture is dubbed Krait 450. While it carries many improvements over the original Krait core, the basic architecture is still the same. Like the Cortex-A15 it competes against, Krait is a 3-wide machine, decoding up to three instructions per clock. In comparison to Cyclone it's a relatively small core, and therefore it won't be as fast in single-threaded performance. Krait 450's tweaked architecture allows it to run at a whopping 2.7GHz, or to be more exact, 2.65GHz. In the Snapdragon 805, we have four of these Krait 450 cores. Qualcomm's signature architectural tweak, putting each core on its own voltage/frequency plane, allows each core to run at a different frequency. That reduces the SoC's power consumption and should translate into better battery life. With four cores at such a high frequency, the Snapdragon 805's CPU gets very good multi-threaded performance, although the relatively narrow Krait core hurts single-threaded performance considerably.

Finally, we have the Tegra K1 and its two different versions. The 32-bit version employs a quad-core Cortex-A15 CPU clocked at up to 2.3GHz; we've seen this CPU configuration in so many SoCs that by this point it's a very well known quantity. The interesting story is the 64-bit Tegra K1, which uses a dual-core configuration of Nvidia's brand new custom CPU architecture, named Denver. If you don't care to know about Denver's architecture, you'd better skip the next section, because there is A LOT to say about Nvidia's custom CPU.

Denver: The Oddest CPU in SoC history

Denver is Nvidia's first attempt at a proprietary CPU architecture, and for a first attempt it's actually very good. Some of Nvidia's expertise as a GPU maker has translated into its CPU design. For instance, Denver works with VLIW (Very Long Instruction Word) instructions: operations are packed together into one long instruction word before being sent down the execution pipelines.

Denver's most peculiar characteristic might be this one: it's an in-order machine, while basically every other high-end mobile CPU is capable of Out-of-Order Execution (OoOE). The lack of dedicated hardware that reorders instructions to reduce memory stalls and increase IPC (Instructions Per Clock) should, in principle, be a huge performance bottleneck. However, Nvidia employs a very interesting (and in my opinion unnecessarily complicated) way of dealing with its in-order architecture.

Without a hardware OoOE engine, Nvidia has to rely on software to reorder instructions and enhance ILP (Instruction Level Parallelism). Denver is actually not meant to decode ARM instructions most of the time. Rather, Nvidia built a decoder that runs native instructions optimized for maximum ILP. For this optimization to occur, Nvidia implemented a Dynamic Code Optimizer (DCO). The DCO's job is to recognize ARM code sequences that are sent to the CPU frequently, translate them into native instructions, and reorder them to reduce memory stalls and maximize ILP. For this to work, a region of memory is reserved to store the optimized instructions.

One implication of this system is that the CPU must be able to decode both native instructions and normal ARM instructions. For this purpose there are two decoders in the CPU block: one huge 7-wide decoder for native instructions generated by the DCO, and a secondary 2-wide decoder for ARM instructions. The difference in size between the two decoders shows how Nvidia expects the native instructions to be used most of the time. Of course, the first time a program runs, there are no optimized native instructions ready, so only the ARM decoder is used until the DCO starts recognizing recurring ARM code from the program and optimizing it; from that point onwards, those specific sequences always go through the native decoder. If a program runs the same code many times (a benchmark, for example), eventually all of its instructions will have corresponding optimized native code stored, and only the native decoder will be used. That is Denver's peak performance scenario.
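As a toy model of the flow just described (the threshold, data structures and function names are all illustrative; Nvidia hasn't published these details, and the real DCO runs in firmware on instruction traces), the dispatch logic might look something like this:

```python
# Toy model of Denver's two-decoder scheme. Everything here is illustrative.

HOT_THRESHOLD = 10        # assumed: executions before the DCO translates a block

translation_cache = {}    # ARM code block -> optimized native (VLIW) code
hotness = {}              # ARM code block -> times executed so far

def run_arm(block):       # stub: the narrow 2-wide ARM decoder path
    print("ARM decode:", block)

def run_native(code):     # stub: the wide 7-wide native decoder path
    print("native decode:", code)

def dco_optimize(block):  # stub: translate + reorder for maximum ILP
    return "optimized(" + block + ")"

def execute(block):
    if block in translation_cache:
        run_native(translation_cache[block])
    else:
        run_arm(block)
        hotness[block] = hotness.get(block, 0) + 1
        if hotness[block] >= HOT_THRESHOLD:
            translation_cache[block] = dco_optimize(block)

for _ in range(12):       # a loop body run repeatedly eventually goes native
    execute("loop_body")
```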

While Nvidia's architecture might be a very interesting move, I ask myself if it wouldn't just be easier to build a regular Out-of-Order machine. But still, if it performs well in real life, it doesn't really matter how odd Nvidia's approach was. 

Now, moving on to the execution portion of the Denver machine, we see why Denver is the widest mobile CPU in existence. That title was previously held by Cyclone, with its 6 execution pipelines, but Nvidia went a step further and produced a 7-wide machine, capable of issuing up to seven instructions at once. That alone should give the Denver core excellent single-threaded performance.

The 64-bit version of the Tegra K1 employs two Denver cores clocked at up to 2.5GHz. That makes it the SoC with the lowest core count among the ones being compared here. While single-threaded performance will most certainly be great, I'm not sure that the dual-core Denver CPU can outrun its triple-core and quad-core opponents.

To test that, let's start our synthetic benchmark evaluation of the CPUs with Geekbench 3, which evaluates both single-threaded and multi-threaded performance.

CPU Benchmarks

In single-threaded applications, Nvidia's custom Denver core takes first place, followed closely by Apple's enhanced Cyclone core in the A8X. Meanwhile, the older Cortex-A15 and Krait cores are far behind, with the 2.2GHz A15 in the 32-bit Tegra K1 pulling slightly ahead of the Krait 450 in the Snapdragon 805.


In multi-threaded applications, where all of the CPU's cores can be used, the A8X, with its tri-core configuration, blows past the competition. The dual-core Denver version of the Tegra K1 gets about the same performance as the quad-core Cortex-A15 variant, with the quad-core Krait 450 coming in last place, but by a very, very small margin.

Apple's addition of an extra core to the A8X's CPU, together with the fact that Cyclone is a very powerful core, makes it easily the fastest CPU on the market for multi-threaded applications. While Nvidia's 64-bit Denver delivers impressive performance thanks to its wide core architecture, its core count works against it in the multi-threaded benchmark; it is, in fact, the only dual-core CPU compared here. Even if it's not as fast as the A8X's CPU, Nvidia's Denver is a beast. Were it in a quad-core configuration, it would absolutely blow the competition out of the water.

The GPU

Moving away from CPU benchmarks, we shall now analyze graphics performance, which is probably even more important than CPU performance, given that a high-end tablet is practically required to double as a decent gaming machine. First we'll look at OpenGL ES 3.0 performance with GFXBench 3.0's Manhattan test, followed by the T-Rex test, which covers OpenGL ES 2.0, and then some of GFXBench 3.0's low-level tests.

The Manhattan test puts the Apple A8X ahead of the competition, followed closely by both Tegra K1 variants, which perform about the same, since they have the exact same GPU and clock speed. Unfortunately, the Adreno 420 in the Snapdragon 805 is no match for the A8X and the Tegra K1, something that points out the need for Qualcomm to up their GPU game.

The T-Rex test paints a similar picture, with the A8X slightly ahead of the Tegra K1, while both of the Tegra K1 variants get about the same score, and the Snapdragon 805 falls behind the other two processors by a pretty big margin.

The Fill rate test stresses mostly the processor's memory interface and the GPU's TMUs (Texture Mapping Units). Since both the Apple A8X and the Snapdragon 805 have the same dual-channel 64-bit LPDDR3 memory interface clocked at 800MHz, the advantage the Snapdragon 805 shows over the A8X can only be attributed to the Adreno 420 having better texturing performance than the PowerVR GXA6850. Meanwhile, the two Tegra K1 variants share a dual-channel 64-bit LPDDR3 interface as well, only at a lower 533MHz clock. The Tegra K1 therefore offers significantly less texturing performance than the A8X and the Snapdragon 805, but is a very worthy performer nevertheless.
The ALU test is about the GPU's sheer compute power. Since Nvidia's Tegra K1 has 192 CUDA cores in its GPU, it naturally takes the top spot here, and by a pretty significant margin.

For some reason, all tests show the 32-bit Tegra K1 in the Nvidia Shield Tablet scoring a few more points than the 64-bit Tegra K1 in the Google Nexus 9. But given that the two processors have the exact same GPU, this difference in performance is probably due to software tweaks in the Shield Tablet's operating system, which would make sense, given that it is more than anything a tablet for gaming.

Thermal Efficiency and Power Consumption

In the ultra-mobile space, power consumption and thermals are the biggest limiting factors for performance. As the three processors compared here are all performance beasts, several measures had to be taken so that they wouldn't drain a battery too fast or heat up too much.

In order to keep power consumption and die size in check, Apple shrank the manufacturing process from 28nm to 20nm, a first in the ultra-mobile processor market. That alone gives it a huge advantage over the competition, since Apple can fit more transistors into the same die area at the same power consumption. Since the A8X is, in general, the fastest SoC available, the smaller process node is important for preserving the iPad Air 2's battery life.

Nvidia's Tegra K1 should also do well in power consumption and thermal efficiency in situations where the GPU isn't pushed too hard. The 28nm HPM process it's built on is nothing special, but it's not outdated for a 2014 processor. And while the Kepler architecture is very power efficient, straining a 192-core GPU to its maximum is still going to produce a lot of heat; the Nexus 9 reportedly gets very warm on the back while running an intensive game.

Finally, the Snapdragon 805 should be the least power-hungry processor, because it is also a smartphone processor. Given that a 5" phone can carry this chip without heating up too much or draining its battery too fast, a tablet should certainly manage the same. To put things in perspective, if we put the Tegra K1 or the Apple A8X inside a smartphone, both would be too power hungry and too hot to make for a decent phone. In any case, the Snapdragon 805 is, like the Tegra K1, built on a 28nm HPM process; given that it's not as much of a performance monster as the other two processors, it must be the least power hungry of the three.

Conclusion

Objectively speaking, the comparisons made here make it pretty much clear that once again Apple takes the crown for the best SoC for this generation of high-end tablet processors. Not that the competition is bad. On the contrary, Nvidia went, in just one generation, from being almost irrelevant in the SoC market (let's face it, the Tegra 4 was not an impressive processor) to being at the heels of the current king of this market (aka Apple). The Tegra K1 is an excellent SoC, and even if it can't quite match the Apple A8X, it's still quite close to it in most aspects.

Meanwhile, Qualcomm is seeing its dominance in the tablet market start to slip. Its latest SoC, the Snapdragon 805, which already ships in smartphones and phablets, is available in only one tablet, while most others carry the Snapdragon 801 or even the 800. That's disappointing, given that a tablet can utilize the processing power more usefully than a smartphone or a phablet. Either way, the Snapdragon 805 is still a very good processor; it's just far from being the fastest. Perhaps Qualcomm should consider, like Nvidia and Apple, making a processor with extra oomph meant only for tablets, because while the Snapdragon 805 is an excellent smartphone processor, it's not as competitive in the tablet market.

Wednesday, July 2, 2014

Samsung Releases New Exynos Processors 5422, 5260


Samsung has prepared a series of new Exynos System-on-Chip processors for this year, addressing both the mid-range and high-end segments. Firstly, there's the Exynos 5 Octa 5422, which is nothing more than a higher-clocked version of last year's 5420 (seen in devices like the Galaxy Note 3 and Note 10.1, as well as the Galaxy Tab Pro, Note Pro and Tab S ranges of tablets). Then there's the first processor in the Exynos 5 Hexa series, the 5260, which, as its name suggests, has a total of six CPU cores. Finally, there's the Exynos 5 Octa 5800, which is basically a 5422 adapted for use in Samsung's Chromebook 2 13.3".

The 5422 is architecturally identical to the Exynos 5420, only with higher clock speeds. It consists of two CPU clusters working in a big.LITTLE configuration: a high-performance cluster of four Cortex-A15 cores clocked at 2.1GHz and a low-power cluster of four Cortex-A7s at 1.5GHz (vs 1.9GHz for the A15s and 1.3GHz for the A7s in the Exynos 5420). The GPU is a 6-core Mali-T628 MP6 clocked at 695MHz, good for up to about 142 GFLOPS of processing power (vs 533MHz and 109 GFLOPS for the 5420). The system is fed by a 32-bit dual-channel LPDDR3-1866 memory interface (14.9GB/s bandwidth), same as the 5420. The increase in CPU clock speed is hardly enough to yield a noticeable performance gain, but the GPU's higher frequency is definitely a significant jump. Currently the Exynos 5422 is only available in the SM-G900H variant of the Galaxy S5.

Next up is the Exynos 5 Hexa 5260, which is more of a mid-range SoC due to its lower CPU core count and weaker GPU. Much like the Exynos 5 Octa processors, it has two CPU clusters in a big.LITTLE configuration: this time, the high-performance cluster consists of two Cortex-A15 cores clocked at 1.7GHz, and the low-power cluster is a quad-core Cortex-A7 at 1.3GHz. The GPU is ARM's Mali-T624, probably in a 4-core configuration, but the clock speed is unknown and so is the theoretical compute power. It can be noted, however, that the T624 has half as many execution units per core as the T628; considering that, plus the fact that it has two fewer cores than the T628 MP6 powering the Exynos 5420/5422, the T624 MP4 ends up with about one third of the T628 MP6's processing power at the same clock speed, as the sketch below shows. Finally, the 5260 is fed by a 32-bit dual-channel LPDDR3-1600 memory interface offering 12.8GB/s of peak bandwidth, which is more than enough for this CPU and GPU. So far, the Exynos 5 Hexa has only made its way into two devices: the Galaxy Note 3 Neo and the Galaxy K Zoom.
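The "one third" figure is simple unit counting; the per-core pipeline counts below are the assumption stated above, since ARM hasn't published the 5260's exact GPU configuration:

```python
# Relative peak throughput at equal clocks, from core and unit counts alone.
t628_mp6_units = 6 * 2   # assumed: 6 cores, 2 arithmetic pipelines per T628 core
t624_mp4_units = 4 * 1   # 4 cores, half the pipelines per core
print(t624_mp4_units / t628_mp6_units)  # ~0.33 -> roughly a third of the T628 MP6
```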

The Exynos 5 Octa 5800 is practically identical to the 5422, except that it has slightly lower clock speeds (2.0GHz vs 2.1GHz for the A15 cluster and 1.3GHz vs 1.5GHz for the A7 cluster) and it's adapted for use in Samsung's Chromebook 2 13.3" (and perhaps other Samsung Chromebooks in the future).

There are also rumors of an upcoming Samsung Exynos chip codenamed 5433, which will allegedly be competitive with Qualcomm's Snapdragon 805, so if that turns out to be true I'd expect such a powerful chip to debut in one of the most awaited devices of 2014, the Galaxy Note 4. Also, you might have noticed that all of the new processors discussed here have 32-bit CPUs, while the rest of the mobile SoC market is slowly transitioning to 64-bit, so I wouldn't be surprised if the Exynos 5433 ends up with a 64-bit CPU, perhaps a Cortex-A57/A53 big.LITTLE combo.

Tuesday, July 1, 2014

Microsoft Surface Pro 3 Review


Microsoft's Surface range of tablets was never successful enough to make a big impression in the tablet market, mostly because the Surface tablets tried to replace both your tablet and your laptop, and ended up failing to replace either properly. Now Microsoft is upping its game with the recently announced Surface Pro 3, choosing to make it much closer to a work-oriented PC. By doing so, Microsoft sacrificed the portability that benefited the previous Surface tablets as entertainment devices. Whether making the Surface Pro 3 more laptop than tablet was a good decision isn't yet clear, but with a higher-resolution screen, better cameras and a considerably slimmer body, the Surface Pro 3 is definitely a better device overall than its predecessor.

Since it boasts a much larger screen than previous Surface tablets (12" compared to 10.6"), the Surface Pro 3's dimensions are inevitably larger, even though the size difference is partly offset by slimmer bezels. Measuring 292.1 x 201.4 x 9.1mm, it's far from an ultra-portable tablet. In fact, it's about the same size as Samsung's Galaxy Note Pro 12.2 (295.6 x 203.9 x 8mm), which has a slightly larger 12.2" screen. The Note Pro 12.2 is significantly thinner, but then again, it doesn't have an Intel Core CPU, nor does it run full Windows 8.1. In fact, the Surface Pro 3 is a very thin tablet considering the hardware it packs. It also weighs 800g, which, again, is pretty light for a 12-inch tablet with an Intel Core processor. In comparison, the Galaxy Note Pro 12.2 weighs 750g.

The back of the Surface Pro 3 is made from the same VaporMg magnesium alloy seen in all other Surface tablets, but like the Surface 2 it has a light-silver finish. This makes the tablet very sturdy, and it also looks very premium. The kickstand that has always been unique to the Surface line is back, but much improved. The first-gen Surface's kickstand had a single 22-degree angle, which wasn't ideal for many use cases; the second generation had two angles, 22 degrees and 45 degrees. Now, with the Surface Pro 3, the kickstand can open to any angle between the initial 22-degree stage and the 150-degree limit, making it far more versatile than its predecessors.

The Surface Pro 3 also features improved cameras: two 5MP units, one on the front and one on the back. The high-res front-facing camera makes the Pro 3 a very good device for video conferencing. As for the rear-facing camera: given that taking pictures with a 10-inch tablet is already very awkward, imagine what doing so with a 12-inch tablet would look like! In any case, it's there, and it's decent enough for a tablet.

The Surface Pro 3 marks the first upgrade in the display department since the original Surface Pro: it's bigger, has a higher resolution and a much more interesting aspect ratio. The 12-inch panel offers much more screen real estate than the previous Surface Pros' 10.6" displays, with the only trade-off being that the tablet becomes a bit too large to be considered truly portable. The resolution has been upgraded from 1920 x 1080 to 2160 x 1440, and the aspect ratio is now 3:2, which makes the screen less wide and taller (in other words, squarer) than the usual 16:9 Windows tablets, including the Surface Pro and Pro 2. As with the other Surface tablets, the Pro 3 uses Microsoft's ClearType technology, which fuses the display layers into a single layer, the benefit being less screen reflectivity and therefore a better sunlight contrast ratio.

Microsoft has made the right choices with the Pro 3's display. The larger size makes its use as a productivity machine as well as a fully-fledged laptop replacement much more feasible, and the increased resolution is also a great improvement. Like I said before, increasing the screen size makes it more suitable for a laptop replacement than a proper media consumption tablet, but this compromise is probably the right one to make. 

The Surface Pro 3 may not look like it, due to its thin chassis, but it packs some very powerful hardware. Like its predecessors, it's equipped with Intel Core processors, in this case the highly power-efficient 4th-generation parts. You can buy the Pro 3 with a Core i3, a Core i5 or even a Core i7 processor, depending on how much you're willing to pay. The following table shows the different processor/RAM configurations and their respective prices:

 Price   | $799                                | $999                                       | $1,299                                     | $1,549                                     | $1,949
 CPU     | Core i3-4020Y (2C/4T @ 1.5GHz)      | Core i5-4300U (2C/4T, 1.9GHz/2.9GHz Turbo) | Core i5-4300U (2C/4T, 1.9GHz/2.9GHz Turbo) | Core i7-4650U (2C/4T, 1.7GHz/3.3GHz Turbo) | Core i7-4650U (2C/4T, 1.7GHz/3.3GHz Turbo)
 GPU     | Intel HD 4200 (200MHz/850MHz Turbo) | Intel HD 4400 (200MHz/1.1GHz Turbo)        | Intel HD 4400 (200MHz/1.1GHz Turbo)        | Intel HD 5000 (200MHz/1.1GHz Turbo)        | Intel HD 5000 (200MHz/1.1GHz Turbo)
 Max TDP | 11.5W                               | 15W                                        | 15W                                        | 15W                                        | 15W
 RAM     | 4 GB                                | 4 GB                                       | 8 GB                                       | 8 GB                                       | 8 GB


For the tablet form factor, even the Core i3 is already a very good processor and should be fine for most basic tasks. I'd say the Core i7 model is overkill for a tablet unless you want to use it for gaming or other intense tasks. All Surface Pro 3 variants have an internal fan for cooling, but they're generally quiet and therefore shouldn't be an inconvenience. Logically, though, the higher CPU bins, especially the Core i7 model, are more likely to make the fan run faster and louder, as well as use up the battery faster, as all Surface Pro 3 models have the same 42Wh battery. Taking heat dissipation, fan noise, battery life and performance into consideration, I consider the Core i5 model the ideal compromise, since it's fast and at the same time not too power hungry.

By the way, Microsoft claims 9 hours of continuous web browsing for the Surface Pro 3, which is very good for a tablet with this hardware, although I'm not sure how the different CPU bins will vary in terms of battery life.
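For what it's worth, that claim implies a fairly modest average power draw. A quick sanity check, assuming the full 42Wh is usable (real batteries won't quite manage that):

```python
battery_wh = 42          # same pack across all Surface Pro 3 models
claimed_hours = 9        # Microsoft's web-browsing figure
print(battery_wh / claimed_hours)  # ~4.7W average for screen, SoC and radios
```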

As with the other Surface Pros, the Pro 3 comes with a pen-style stylus. This time, however, it's not a Wacom digitizer, which means fewer pressure-sensitivity levels (256 on the Pro 3 vs 1024 on previous Surface Pros), but the new N-trig technology allows for some nifty software features. As usual, there's nowhere on the tablet's body to store the Surface Pen. The Pro 3's new 3:2 aspect ratio combines very well with the stylus: in portrait mode, the screen's proportions make it look rather like a 12" drawing pad.

Pricing and Conclusion

Microsoft is offering the Surface Pro 3 in a variety of processor/RAM/storage options. The entry-level model has a Core i3 CPU, 4 GB of RAM and a 64 GB SSD, and costs $799. Then there's a Core i5 model with 4 GB of RAM and 128 GB of storage for $999, and another Core i5 model with 8 GB of RAM and a 256 GB SSD for $1,299. A Core i7 model with 8 GB of RAM and a 256 GB SSD costs $1,549, and finally the most expensive Core i7 model, with 8 GB of RAM and 512 GB of storage, costs $1,949. The Type Cover keyboard for the Surface Pro 3 is sold separately for $129. Microsoft should really bundle this keyboard with the tablet: the tablet itself is already very expensive, and, like I said before, the Pro 3 is more of a work-oriented laptop replacement than a media-consumption tablet, and in order to do what it does best, that is, replace your laptop, it needs the keyboard cover.

The first two Surface Pros tried to be both a laptop and a tablet, but failed at being either. The Surface Pro 3, by contrast, excels at productivity tasks, making itself a worthy laptop replacement, and is at least usable as a tablet, though it sacrifices the portability that draws people to tablets in the first place. So while it is a compromise, it's the best one Microsoft could have made.

So what is the verdict on the Surface Pro 3? Well, it has the screen real estate, the hardware, the software and a keyboard cover to fully replace your laptop; at the same time it's more portable than any ultrabook out there, probably has more battery life than most of them, and can still double as a tablet, albeit a very large one. Wrap that up with a versatile kickstand, a high-res screen, a chassis that may just be the thinnest to sport an Intel Core CPU and an improved stylus, and you have just about the best tablet-laptop hybrid so far, and most certainly the best Surface tablet ever released.

Sunday, May 18, 2014

Xiaomi Announces the Mi Pad, First Tegra K1 Tablet


Four years into its existence, Chinese smartphone manufacturer Xiaomi has achieved great success with its smartphone sales in the Chinese market, and is now even starting to expand to other countries. In fact, the company has just announced its first tablet, dubbed the Mi Pad. As is common with many Xiaomi products, the Mi Pad is designed very similarly to one of its main competitors' products; in this case, it looks like an iPad mini with a back that resembles the plastic, colourful iPhone 5c. However, the Mi Pad does bring something new to the table: a remarkably low 1,499 yuan (US$240) starting price and what might be the best mobile processor designed to date, Nvidia's Tegra K1.

To say that the Xiaomi Mi Pad is based on Apple's iPad mini Retina would be an utter understatement. It has the same screen size and resolution: a 7.9" diagonal, a 4:3 aspect ratio and 2048 x 1536 pixels, which works out to 326 pixels per inch. "Copied" or not from Apple's tablet, it has to be said that this is a very good screen for such a low-priced tablet.

Since the screen size and aspect ratio are the same, the Mi Pad also has very similar dimensions to the iPad mini, measuring 202 x 135mm. It's thicker, though, at 8.5mm, but that's because it has a huge (for its size) 25Wh battery (vs 23.4Wh for the iPad mini) to power both the high-resolution screen and the beefy 192-core GPU in the Tegra K1. The Mi Pad is also slightly heavier than its competitors, weighing 360g (vs 338g for the iPad mini Retina), although it's not exactly heavy, and the large battery more than compensates for the added weight.

The back of the tablet is made of glossy plastic, which will be available in a variety of different colors (yellow, pink, blue, green, white and black). Xiaomi itself says that it uses the same kind of plastic build as the iPhone 5c. 

The tablet has a decent (on paper, at least) 8MP Sony camera with an F/2.0 aperture on the rear, and an unusually high-resolution 5MP camera on the front, so it should be excellent for video conferences and selfies.


The main reason the Mi Pad stands out from other tablets is that it's the first device to ship with Nvidia's Tegra K1 processor. While Nvidia's past Tegra processors were never the fastest, the Tegra K1 marks a huge change in Nvidia's mobile strategy. The Tegra K1 actually has two variants: one with a quad-core 32-bit Cortex-A15 CPU at up to 2.3GHz (we've never seen a Cortex-A15 clocked this high before), plus Nvidia's signature fifth low-power A15 core for handling light workloads while consuming very little power; the other sports a dual-core version of Nvidia's brand new custom 64-bit CPU core, Denver, which will apparently run at up to 2.5GHz. The Denver K1 will only be released later this year, so it's definitely the Cortex-A15 variant powering the Mi Pad. This is no issue, though, as the Cortex-A15 is still one of the fastest CPU cores available, especially at 2.3GHz.

But what really stands out in the Tegra K1 is Nvidia's abandonment of the old GPU architecture used in previous Tegras' ULP GeForce GPUs in favor of its most recent, most efficient Kepler architecture. This immediately means the Tegra K1 supports a wide range of APIs, including OpenGL ES 3.0 and DirectX 11. The Kepler GPU in the Tegra K1 is a full SMX unit, which means 192 unified shader cores, more than in any other mobile GPU. Not only that, but Kepler's cutting-edge power efficiency means that even in a thermally and power-constrained form factor like a tablet, the GPU can reach very high clock speeds; Nvidia says it can go up to 950MHz, which is just insane for a mobile SoC. We still can't say whether the Mi Pad will heat up or drain its battery too fast under load, but given Kepler's efficiency, I believe neither will be a problem.

Some benchmarks of the Mi Pad that surfaced with its announcement show truly impressive GPU performance. For instance, it scored 30fps on the GFXBench 3.0 Manhattan Offscreen test, which is unprecedented for a mobile device. For comparison, the second fastest device for this particular benchmark, the iPad Air, tied with the iPad mini Retina and the iPhone 5s, scores 13fps on the same test. In other words, virtually all mobile SoCs are dwarfed by the Tegra K1 when it comes to GPU performance.

As there's no way to ascertain whether the Mi Pad is running the Tegra K1's GPU at the full 950MHz, I decided to try to determine it indirectly, so I ran the GFXBench T-Rex Offscreen test (Manhattan isn't available on Windows yet) on my G750JW laptop, which has a GPU based on the same Kepler architecture. My laptop's GPU contains four Kepler SMX units with a max clock speed of 910MHz; in other words, four times the cores of the Tegra K1 at a very close clock speed. And indeed, my laptop scored a bit less than four times what the Mi Pad scored in the same test (the Mi Pad scored 60fps, by the way), the difference being attributable to my laptop GPU's 40MHz-lower clock speed.
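Here's a quick sketch of that reasoning, assuming performance scales linearly with SMX count and clock, which ignores memory bandwidth and driver differences:

```python
# Expected laptop-to-Mi-Pad ratio if T-Rex scales with SMX count x clock.
mi_pad_smx, mi_pad_mhz = 1, 950    # if the Mi Pad runs the full 950MHz
laptop_smx, laptop_mhz = 4, 910    # the G750JW's Kepler GPU
ratio = (laptop_smx * laptop_mhz) / (mi_pad_smx * mi_pad_mhz)
print(ratio)                  # ~3.83 -> "a bit less than four times", as observed
print(192 * 2 * 950 / 1000)   # 364.8 GFLOPS at 950MHz, the figure quoted below
```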

Hence, I can confirm that the Mi Pad has its 192-core GPU running at the full stunning 950MHz. This gives it 364.8 GFLOPS of peak processing power, much more than what the Xbox 360 and PS3 could produce. So if the Mi Pad is rather uninspiring in its design and specs, at least its performance is downright amazing. The GPU is definitely capable of powering demanding mobile games, even with the Mi Pad's high 2048x1536 resolution. And with Nvidia bringing console games to its Tegra devices, like Portal and Half Life 2, the Mi Pad has the potential to be just about the best gaming tablet on the market. 

Conclusion

It's a shame that Xiaomi doesn't sell its devices in many countries, because they've been producing some very good stuff lately. Even though the Mi Pad isn't exactly original in its design and spec sheet, that doesn't mean it's a bad tablet. Quite the contrary. With a high-resolution display, an excellent duo of rear and front cameras, a large battery, and what is by far the fastest mobile processor available, the Mi Pad has all it needs to compete with top-end small tablets like the Nexus 7, the Samsung Galaxy Tab Pro 8.4 and the iPad mini Retina. Wrap that up with a 1,499 yuan/US$240 price tag, and you may just have not only the best portable tablet, but also the best tablet value on the market.

The tablet will go on sale in China some time around June, and will subsequently roll out to a few other emerging markets as part of Xiaomi's expansion plan (among them Brazil, Malaysia, Mexico, Russia, Indonesia and Thailand). In the US, it'll probably be available at some point through online importers, but that means the price tag will end up a bit above $240. If you can get your hands on the Mi Pad, and if its resemblance to some Apple products doesn't bother you too much, I would definitely recommend it.

Monday, March 3, 2014

Samsung Galaxy S5 Revealed

After numerous leaks, the next iteration of the Galaxy S series is finally official. The new Samsung Galaxy S5, announced at this year's MWC in Barcelona, is a relatively conservative upgrade to the Galaxy S4, and doesn't bring some of the huge improvements many leaks were suggesting, like a 64-bit CPU and a 2K screen. Instead of bringing something brand new to the market, the Galaxy S5 seeks to improve upon its predecessors.

With an IP67 certification, a 5.1" 1080p screen, a 16MP rear camera with 4K video recording, Snapdragon 801 processing power, a new take on fitness features with a heart rate monitor, and a swipe-based fingerprint scanner, the Galaxy S5 isn't a game-changing smartphone, but it's still probably the best smartphone available. If only Samsung had wrapped all of these new features in a more premium metal design...

Actually, my biggest gripe with the Galaxy S5 has to do with its design. On the bright side, Samsung did ditch the glossy plastic of the Galaxy S4 and S3 in favor of a textured finish, though it's still plastic nevertheless. However, instead of molding the plastic to look like leather, as on the Note 3, the S5's back has what look like perforated holes, not unlike the back of the 2012 Nexus 7. While it definitely looks less cheap than the glossy plastic, it's still far from premium and attractive. Even a simple matte plastic back like the one on the Nexus 5 or Moto G is more inviting than the S5's textured design. The Galaxy S5 will be available in four colors at launch: a rather standard black and white, and two unusual colors, blue and gold.

The front face of the phone remains similar to the Galaxy S4's, except that the sides are a bit less rounded, as on the Note 3. The only real difference is that the Menu key has been replaced with a Task Switcher key, the Menu function now being displayed in the UI.

The Galaxy S5 is significantly larger than the Galaxy S4, despite only a 0.1" increase in screen size. It measures 142 x 72.5 x 8.1mm and weighs 145g; in comparison, the Galaxy S4 measures 136.6 x 70 x 7.9mm and weighs 130g. The main culprit for the S5's larger footprint is probably the IP67 certification, which is why I'd like to see Samsung release a non-waterproof version of the Galaxy S5 with a smaller, lighter body, perhaps (finally) with a metal construction. In any case, the Galaxy S5 isn't too big or too heavy. Compared to another ~5" smartphone with an IP certification, the Sony Xperia Z2, the Galaxy S5 is smaller and much lighter.

The display is, on paper, almost identical to the Galaxy S4's. Screen size has increased only mildly, to 5.1", while the resolution stays at 1080p, resulting in a 432ppi pixel density. That is lower than the S4's 441ppi (the same resolution spread over a slightly larger screen), but the difference between 441ppi and 432ppi is impossible to notice. Many rumors pointed to a 2K screen in the Galaxy S5, but I was actually very happy to see Samsung didn't make that move. A 2K screen would require too much power and GPU resources, not to mention that, at this screen size, the difference between 2K and 1080p wouldn't be very noticeable. Maybe in one or two years, when batteries are larger and GPUs more powerful, it might make sense to move up to 2K in smartphones, but for now it's a no-go, in my opinion. The screen technology employed is AMOLED, of course, with some improvements over the Galaxy S4 in brightness output and color reproduction.
So the Galaxy S5's screen isn't very different from the S4's, but then again the S4's display is about as good as smartphones need for now, and the improvements to the AMOLED technology make the S5's display at least slightly superior.
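For reference, those pixel densities follow directly from the screen geometry (assuming square pixels and the quoted diagonals):

```python
# Pixel density = diagonal resolution in pixels / diagonal size in inches.
from math import hypot
print(hypot(1920, 1080) / 5.1)  # ~432 ppi for the Galaxy S5
print(hypot(1920, 1080) / 5.0)  # ~441 ppi for the Galaxy S4
```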

The processor in the Galaxy S5 is a Snapdragon 801, which offers much more power than the original Galaxy S4's Snapdragon 600. The 801 is, however, very similar to the Snapdragon 800 found in the LTE-A variant of the Galaxy S4, which was released towards the end of 2013. The Snapdragon 801 boasts four Krait 400 cores running at up to 2.5GHz, an Adreno 330 GPU with an unprecedented clock speed of 578MHz, and a 64-bit wide LPDDR3 memory interface at 933MHz. The Snapdragon 801 may not be a huge step up from the 800, but that doesn't stop it from being the most powerful SoC to date. After all, the Snapdragon 800 already went neck and neck with the Apple A7 in some benchmarks, and outperformed it in others, so the Snapdragon 801 only ensures that Qualcomm retains the title of maker of the fastest SoC in existence. In short, the Galaxy S5 boasts the best performance available in the mobile world.
The Galaxy S5 packs 2 GB of RAM, just like last year's model. I expected this year's S-flagship to pack 3 GB of RAM like the Note 3, but I assume Samsung decided 2 GB was enough for the tasks the Galaxy S5 will undertake, and frankly I agree, especially since the S5 runs Android KitKat out of the box. The Galaxy S5's battery also got bigger compared to the S4's, but not by much: the difference is only 200mAh, making the S5's battery a 2,800mAh unit. Given that the screen size and resolution are very similar to the Galaxy S4's, overall system power consumption shouldn't increase much, so even the mild increase in battery size should translate into slightly better battery life.

The Galaxy S5 continues the trend of steadily improving camera quality, packing a 16MP rear shooter. Aperture remains at F/2.2, and the new camera module uses ISOCELL technology, which adds separation between pixels, improving effective resolution and dynamic range. One feature ported over from the Note 3 is 4K video recording at 30fps. We still don't have image and video samples from the Galaxy S5, but I'm confident image quality will be significantly improved over the Galaxy S4. The front-facing camera is a 2MP unit capable of 1080p video recording.

Until now I've only talked about improvements over the Galaxy S4, but what brand new features does the Galaxy S5 add? Firstly, there's a brand new take on smartphone fitness features with a heart rate monitor, located next to the LED flash. Some people will find this feature gimmicky, but I can definitely see it being useful for people with an active lifestyle (it fits perfectly with the ruggedness of the Galaxy S5).
Also, in response to Apple's Touch ID, Samsung has introduced a slightly different kind of fingerprint scanner on the S5. Instead of simply resting your finger on the home button, as on the iPhone 5s, you unlock the Galaxy S5 by swiping a finger down from the bottom-center of the screen across the home button. It's not as convenient as Apple's solution, but it's still very good nevertheless.

Friday, February 28, 2014

Qualcomm's 2014 SoC Lineup: Snapdragon 805, 801, 615, and More


Qualcomm enjoyed a very profitable 2013, with its Snapdragon 400, 600 and 800 System-on-Chips practically dominating the mobile market. To keep manufacturers interested in its offerings, Qualcomm recently announced a refresh of various tiers of the Snapdragon line. Specifically, we have six new Snapdragon SoCs coming out this year.

On the top tier, we have the Snapdragon 801, which will be found inside the Samsung Galaxy S5, the Sony Xperia Z2 and the Xperia Z2 Tablet, as well as the Snapdragon 805, which is even more powerful than the 801 but will only be available later this year. On the 600 tier, Qualcomm will offer the Snapdragon 602A, which isn't intended for mobile devices; rather, it's Qualcomm's offering for in-vehicle infotainment systems. Two 64-bit Snapdragon 600-tier parts will also arrive later this year: the 610, which packs a quad-core CPU, and an octa-core variant named the 615. The Snapdragon 410 is also due this year, and it'll pack a 64-bit CPU too.

You might have noticed that, unlike last year's Snapdragon lineup, where the 200/400/600/800 designations made it very clear where each tier stood, this year's numbering is a bit of a mess. It's still clear that the 805 and 801 are faster than the 610, 615 and 602A, which in turn are faster than the 410, but there's room for confusion between variants within the same tier: 805 is easily mistaken for 801, and the same goes for 610 and 615. It's just not as simple as last year's lineup. And it's not only the nomenclature that's muddled, but also the architectural picture. Notice that while the 600 and 400 tiers are being upgraded to 64-bit CPU cores, the 801 and 805 are stuck at 32-bit, which simply does not make sense: logically, 64-bit would come to the top tier first and then trickle down, but Qualcomm inexplicably did the contrary. As it stands, though, 64-bit processing still isn't a very important feature in mobile devices, especially since Android doesn't yet support it.

So let's analyze each new Qualcomm SoC one at a time, starting with the high end.

The Snapdragon 801, found in recently announced Samsung and Sony flagship smartphones and tablets, is merely a mild upgrade over the Snapdragon 800. The biggest change is the addition of eMMC 5.0 support, which allows for faster flash storage. Other than that, we still have a quad-core Krait 400 CPU, although the clock speed has been raised to up to 2.5GHz, an ~8.7% increase over the Snapdragon 800. The Snapdragon 801 also keeps the Adreno 330 GPU, but raises its clock from 450MHz to up to 578MHz (an ~28% increase in theoretical performance). The memory interface, while still dual-channel 32-bit LPDDR3, gets a clock speed increase to 933MHz, which results in 14.9GB/s of theoretical bandwidth, up from 12.8GB/s in most Snapdragon 800 variants. And that's all. Even though the architecture remains unchanged, the clock speed boosts should give the 801 a considerable performance advantage over the 800.
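Those uplifts are pure clock-speed arithmetic, reproduced here for clarity (a 933MHz DDR clock means 1866 MT/s):

```python
print(2.5 / 2.3 - 1)           # ~0.087 -> ~8.7% CPU clock increase over the 800
print(578 / 450 - 1)           # ~0.284 -> ~28% GPU clock increase
print((64 / 8) * 1866 / 1000)  # ~14.9 GB/s from a 2x32-bit LPDDR3-933 interface
```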

Then there's the Snapdragon 805, which will be available later this year and will employ four Krait 450 cores, a refresh of the Krait 400. The new CPU's improved efficiency allows clock speeds to go up to 2.7GHz. Unfortunately, the Krait 450 core is still based on ARMv7; in other words, it doesn't support 64-bit processing. The GPU in the Snapdragon 805 gets a huge uplift with the new Adreno 420, which brings a DirectX 11-class feature set, improves texture performance, and adds dedicated tessellation hardware, something previously seen only on PC graphics cards. The memory interface also gets a huge boost, moving to a 128-bit wide (quad-channel) LPDDR3-1600 interface. The added width results in a peak theoretical memory bandwidth of 25.6GB/s; for comparison, most mobile SoCs today top out at 14.9GB/s.
The Snapdragon 805 doesn't bring any big changes compared to the 800, with only mild improvements on the CPU side, but the new, more capable GPU and the impressively wide memory interface are enough to make the Snapdragon 805 an excellent processor (the sketch below shows where these bandwidth figures come from). When it's available, I imagine it'll be quite capable of competing with the Tegra K1, the Apple A8 and whatever Samsung has to offer by then.
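To make the theoretical figures above concrete, here's a quick back-of-the-envelope sketch in Python showing where they come from. The transfer rates and bus widths are the ones quoted above; the Snapdragon 800's 2.3GHz base clock is implied by the ~8.7% figure rather than stated outright.

# DDR memory moves data on both clock edges, so the effective transfer
# rate in MT/s is twice the interface clock in MHz (933MHz -> 1866 MT/s).
def bandwidth_gbs(transfer_mts, bus_bits, channels):
    """Peak theoretical memory bandwidth in GB/s."""
    return transfer_mts * 1e6 * (bus_bits / 8) * channels / 1e9

print(bandwidth_gbs(1600, 32, 2))  # Snapdragon 800: ~12.8 GB/s
print(bandwidth_gbs(1866, 32, 2))  # Snapdragon 801: ~14.9 GB/s
print(bandwidth_gbs(1600, 32, 4))  # Snapdragon 805: ~25.6 GB/s

# The 801's clock uplifts over the 800:
print((2.5 - 2.3) / 2.3 * 100)     # ~8.7% CPU clock increase
print((578 - 450) / 450 * 100)     # ~28% GPU clock increase

Note that the 805's bandwidth jump comes entirely from doubling the interface width; the LPDDR3-1600 memory itself is no faster than the 800's.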

Moving down to the Snapdragon 600 tier, we have the new Snapdragon 602A which, as stated before, isn't meant for mobile devices but for in-car infotainment systems, an area in which many companies, NVIDIA included, have suddenly taken an interest. Not much is known about the 602A, but we know it'll have a quad-core Krait (400?) CPU and an Adreno 320 GPU.

The Snapdragon 610 is one of the few Snapdragon processors that support 64-bit processing. It has four ARM Cortex-A53 cores (clock speed unknown) and an Adreno 405 GPU; so far we know almost nothing about that GPU, although if it turns out to be a cut-down Adreno 420, we can expect the DirectX 11-compatible architecture and the dedicated tessellation hardware. For the record, the Cortex-A53 is the lower-end of the two Cortex-A5x CPUs released so far, and its performance relative to current CPUs remains to be seen.
The Snapdragon 615 is essentially a 610 with four extra CPU cores. So it keeps the Adreno 405 GPU, but moves to an octa-core Cortex-A53 CPU. I'm a bit disappointed that Qualcomm decided to increase the core count rather than use a more powerful CPU core. Most applications don't scale well beyond a few threads, so not only will the last four cores or so probably go underutilized, the weaker single-threaded performance of the Cortex-A53 will hurt overall performance in applications that don't scale to multiple threads (the quick sketch below illustrates the diminishing returns). I would've preferred if Qualcomm had used a smaller number of high-end Cortex-A57 cores, or even its own Krait cores (even if that meant sacrificing 64-bit processing).
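For the curious, here's a minimal Amdahl's-law sketch in Python of why extra cores hit diminishing returns. The 70% parallelizable fraction is a made-up example; real workloads vary widely.

# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel.
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

p = 0.7  # hypothetical: 70% of the workload parallelizes
for cores in (1, 2, 4, 8):
    print(cores, round(speedup(p, cores), 2))
# Prints: 1 -> 1.0, 2 -> 1.54, 4 -> 2.11, 8 -> 2.58

In this example, going from four to eight cores buys only about 22% more speed, while a weaker core drags down the serial portion all the time.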

Finally, there's the Snapdragon 410 mid-to-low-end processor, which combines four 64-bit Cortex-A53 cores at 1.2GHz with an Adreno 306 GPU (since this GPU belongs to the 3xx series, I suspect it won't get the architectural upgrades that the Adreno 405 and 420 did). Not much else to say here, except to point out that even the weakest new Snapdragon got an upgrade to 64-bit processing, while the high-end processors didn't.

While I believe there won't be another big leap in SoC performance until mobile processors built on a 20nm process show up, I'm not very impressed with Qualcomm's 2014 lineup. The Krait 450 CPU is a very small upgrade over the Krait 400, and the heavy reliance on the Cortex-A53 core in the 600 and 400 tiers is disappointing given that core's lower-end nature. As for Qualcomm's decision to give the lower-end processors 64-bit support while leaving the high end stuck at 32-bit: since it's likely that a) Qualcomm doesn't have a 64-bit successor to Krait yet and b) the Cortex-A57 core still isn't available, and since I wouldn't have liked to see more Cortex-A53s in the 800 tier, I'll let that one slide. I might as well reiterate, though, that Qualcomm's nomenclature for its new SoCs can be rather confusing. On the bright side, at least the Snapdragon 805 is bound to be very competitive, if not industry leading, in terms of GPU performance and memory bandwidth.

domingo, 16 de fevereiro de 2014

Motorola Moto G Full Review: An Excellent Smartphone For $179

The Moto X smartphone gave us a glimpse of what Google had in store for Motorola (before selling it to Lenovo). It's an attractive smartphone that, with innovative ways to control and use the phone and thorough customization of its design, proved that core counts and pixel counts aren't everything that makes a flagship. With the Moto X occupying the flagship end, Motorola then set out to produce a budget smartphone that made as few compromises as possible compared to its flagship counterpart, and that is where the Moto G stands.

For $179, the Moto G offers an experience that isn't far from the Moto X's, and it's one of the only smartphones in that price range you wouldn't be frustrated to own. It may not have the Moto X's Active Display and Always On voice command features, nor its wide array of customization options, but the in-hand feel and overall user experience of the Moto X are mostly preserved at a much lower price. If you can't, or don't want to, spend a lot of cash on a smartphone, but still want a good device, the Moto G is the perfect pick. The low price does mean some key compromises, though: no LTE connectivity, a smaller, lower-resolution display, an unimpressive camera and a modest processor. But then, no other phone in this price segment does any better.


Design




The Moto G's design is just like the Moto X's, except slightly bigger in all three dimensions. At its thickest point, the Moto G measures 11.6mm. It's also heavier than the Moto X, and indeed quite heavy for its size, at 143g. The Moto G may be a thick, heavy phone compared to more expensive flagships with even larger displays, but on the plus side this is largely down to its fairly large battery. The 2,070mAh battery may not hold a candle to the Galaxy S4's 2,600mAh unit, but then again the Moto G's total system power consumption is much lower than the Galaxy S4's. And even though the Moto G is heavier than some other phones, it's definitely light enough to use comfortably.




The removable back cover of the Moto G, unlike the Moto X's, is the only part of the phone you can customize. The standard shell that comes with the phone is black, but you can also buy shells from Motorola in a variety of colors: white, red, yellow, blue and turquoise, among others. Each shell sells for $14.99. Motorola also sells Flip Shells, which, as the name implies, have a magnetic flip cover that protects the phone's screen; these cost $29.99. For $19.99 you can also buy a Grip Shell, a thicker, more durable shell that also improves the phone's grip, again, as the name implies. All shells, in all colors, have a matte finish, which depending on the color makes them quite the fingerprint magnets. By the way, removing the back cover gives you access to the microSIM card slot. Unfortunately, though, the battery is sealed inside and there's no expandable storage.


You may not be able to customize every last detail of the Moto G using Moto Maker, like with the Moto X, but it still has far more customization than most phones.
The Moto G's characteristic curved back fits the hand nicely and makes the phone less likely to slip from your hand. The phone is relatively compact and easily operable with one hand.




The microUSB port is placed on the bottom, while the headphone jack is on top. The power and volume buttons are located on the right side; they're small, but feel very solid and have excellent response. Finally, the left side of the phone is bare.


The default black shell is simple and understated, much like the Nexus 5's matte black back. The 5MP camera sits at the top-center, with the speaker to its left, which is nothing special but gets pretty loud without distorting. Under the camera there's an LED flash, and below it there's the same concave Motorola logo as on the Moto X. As a nice design touch, the logo's depression sits exactly where your index finger usually rests when you're holding the phone, which actually feels quite comfortable.




Overall, the Moto G's design is nothing short of excellent. Its curved body is interesting to look at and very comfortable to hold. The phone may be thicker than many recent smartphones, but the comfort this design offers more than compensates for it. In a mid-range segment full of glossy, creaky smartphones, the Moto G truly stands out.


I'd like to emphasize just how good the Moto G's build quality is. I've dropped it quite a few times since I got it, and it still looks intact. One of the last things you expect from a budget phone is Gorilla Glass 3 protection, so I was very happy to see it on the Moto G's spec sheet. Another thing you wouldn't expect on a $179 phone is water resistance. Motorola says the Moto G has a water-resistant nano-coating protecting its internals, but since it carries no IP certification I wouldn't advise chucking your Moto G into a swimming pool. That said, some tests on YouTube show the Moto G surviving for over 30 minutes underwater and still functioning perfectly afterwards. I didn't go as far as testing how my Moto G would do underwater, but I did (accidentally) splash water on it enough to know it can survive some exposure. Like I said, though, without an IP certification this is more of a safeguard in case you accidentally drop your phone in water than a reason to try taking underwater selfies.

Display


The Moto G sports a 4.5" IPS LCD display with a 1280 x 720 resolution and a 326ppi pixel density. These specs may not sound very exciting, and they're not, but for a mid-range smartphone they're absolutely impressive. The display may not be as large as those of recent smartphones, but it strikes a good balance between too small and too big (5", in my opinion, is dangerously close to "too big" territory). The 720p resolution sounds like old news, but just one year ago that was flagship stuff, and it's still more pixels than the latest iPhones pack. The 326ppi also exactly matches the iPhone's pixel density, so while it may not be as sharp as recent Android flagships, it's still crisp enough to be called a "Retina"-class display.
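For reference, pixel density follows directly from the resolution and the diagonal size; a quick Python snippet confirms the 326ppi figure quoted above.

import math

# Pixel density = diagonal resolution in pixels / diagonal size in inches.
width_px, height_px, diagonal_in = 1280, 720, 4.5
print(round(math.hypot(width_px, height_px) / diagonal_in))  # ~326 ppi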


As the Moto G has an IPS LCD display, not AMOLED like the Moto X, it can't have that nifty Active Display feature. It also means that, compared to AMOLED, blacks won't be as deep and colors won't pop as much. To be fair, though, the Moto G has excellent black levels for an LCD, and the colors are quite good too. In fact, in this regard the Moto G has the best display in the mid-range segment.


Processor


One of the main compromises a mid-range smartphone has to make is the silicon inside. The Moto G has a Snapdragon 400 processor, a common tier of Qualcomm silicon for mid-range smartphones. This particular Snapdragon 400 SKU, codenamed MSM8226, combines a quad-core Cortex-A7 CPU @ 1.2GHz with an Adreno 305 GPU. These specs are really nothing special, hence the Moto G can't compete with current flagships. Then again, performance is the aspect of mobile technology that advances the fastest, so while the Moto G isn't even close to current flagships, it can actually match 2012's flagships.


For those of you who aren't aware of where the Cortex-A7 stands in ARM's lineup: it's architecturally similar to the Cortex-A8, i.e. an in-order, 2-wide machine. Thanks to power-efficiency and architectural improvements over the Cortex-A8, however, a Cortex-A7 core can deliver up to Cortex-A9 levels of performance while consuming much less power.


As benchmarks will show, the Adreno 305 GPU is no match for current high-end mobile GPUs, but at the same time it's one of the most modern GPUs, architecturally speaking, thanks to its full OpenGL ES 3.0 compatibility. It may not have the power to run demanding games or benchmarks super-smoothly, but at least it has excellent API compatibility, unlike Nvidia's Tegra 4, which may have more processing power, but can't run OpenGL ES 3.0 applications.


In the SunSpider 1.0.2 test, which measures JavaScript performance (lower is better), the Moto G scored 1,377.1ms. Note that the test was run on the Chrome browser, which is notorious for not being the fastest at JavaScript processing; Chrome is the Moto G's default browser, and the standard WebKit Android browser isn't available on the device.

Now on to AnTuTu, a benchmark which measures all-around performance of the phone, consisting of UX performance tests, as well as CPU, RAM, GPU and storage I/O tests.
The Moto G managed a pretty decent score here, even slightly outperforming the Nexus 4. It's still no match for recent flagships, but then again it's not meant to be.

Now moving on to 3D benchmarks, let's start with the recently released GFXBench 3.0.
The Manhattan test is one of the only OpenGL ES 3.0 benchmarks in existence, and unfortunately it doesn't put the Moto G in a very good position, as the smartphone is simply trumped by its competitors. You may notice, however, that all devices being compared to the Moto G in this chart are high-end devices with top-notch GPUs. I would compare the Moto G to other budget devices, but there weren't any for this test because the Adreno 305 is the only mid-range GPU that supports OpenGL ES 3.0. So while it's nice that the Moto G does support OpenGL ES 3.0, it's not properly equipped to render ES 3.0 games smoothly. Note that this is the Offscreen Manhattan test, which runs at a non-native resolution of 1080p. 
The Manhattan Onscreen test gives the Moto G an advantage over recent flagships since its 720p resolution strains the GPU much less than some of the 1080p-toting smartphones being compared here. The Moto G is still behind all of its competition, but now it's at least close to the Galaxy Note 2 and the Galaxy S4. 

Since the T-Rex test runs on any OpenGL ES 2.0-compatible GPU (i.e. all recent GPUs) I could put some fair comparison points for the Moto G. The Moto G is able to nearly match the Galaxy S III and the iPhone 5/5c's performance, at least in Offscreen mode. Not bad for a $179 phone. 

At native resolutions, the Moto G does even better. 

In the 3DMark test the Moto G scored pretty well too. It outperformed the Galaxy S III and, again, it got pretty close to the iPhone 5/5c.

The average 3DMark score takes into account both the GPU-bound graphics tests and the CPU-bound Physics test. The graphics-only scores show the Moto G's Adreno 305 GPU struggling to keep up: it still easily outperforms the Tegra 3, but falls behind the iPhone 5/5c and the Galaxy S III.
The Physics score, which mostly reflects CPU performance, puts the Moto G pretty much on par with the Tegra 3-based HTC One X+, the Samsung Galaxy S III and the iPhone 5 and 5c, with the Galaxy S4 still pulling ahead significantly. This test reflects how the Cortex-A7 core performs similarly to the Cortex-A9, seeing as the quad Cortex-A7 @ 1.2GHz in the Moto G matches the quad A9s @ 1.4GHz in the Galaxy S III and HTC One X+.

As a budget device, the Moto G has no hope of competing with current high-end smartphones. However, its performance is at least competitive with the likes of the Galaxy S III and the iPhone 5 and 5c, and it's good that such an inexpensive phone can match 2012's flagships.

Subjectively speaking, the Moto G performs pretty well. It's driving a pretty high-resolution screen for its price range, yet it's still able to deliver a fluid UI experience. The near-stock Android build doesn't have any bloatware holding back the processor, and it makes for a simple and intuitive user experience. Moving around the home screen and the app drawer is very smooth; however, after filling almost all five home screens with widgets and app icons, things did occasionally get a bit slower. I suspect the Snapdragon 400's single-channel LPDDR2-1066 memory interface occasionally struggles to keep up with the bandwidth demands of the 720p screen, an issue I recall last seeing in Tegra 3 tablets that also had 720p displays. Launching apps was mostly quick, and web browsing is also pretty fast. Chrome isn't the smoothest browser for scrolling and pinch-zooming, and I did find occasions where scrolling and zooming became a bit laggy. When loading more complex pages I also found that the Moto G freezes for long enough to test my patience.
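That's a hunch rather than something I can measure directly, but some back-of-the-envelope math in Python makes it plausible. The LPDDR2-1066 single-channel figure is the one quoted above; the three-layer composition count is purely an assumption for illustration.

def peak_bw_gbs(transfer_mts, bus_bits):
    """Peak theoretical bandwidth of a single memory channel in GB/s."""
    return transfer_mts * 1e6 * (bus_bits / 8) / 1e9

print(peak_bw_gbs(1066, 32))  # ~4.3 GB/s for single-channel LPDDR2-1066

# Scanning out one 32bpp 720p frame at 60Hz:
scanout = 1280 * 720 * 4 * 60 / 1e9
print(scanout)  # ~0.22 GB/s per full-screen layer

# Compositing, say, three full-screen layers (read each, write the result)
# before the CPU and GPU even get their share of the bus:
print(scanout * (3 + 1))  # ~0.88 GB/s spent just feeding the display

Roughly a fifth of the theoretical bus going to the display alone leaves little headroom once real CPU and GPU traffic, which never reaches theoretical peak anyway, piles on top.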

The only area where the Moto G really struggles is multitasking. Jumping between apps causes a bit of a slowdown, and the transition animations get pretty laggy when I'm jumping between demanding apps, like Chrome. Also, after using the phone for some time it becomes apparent that the Moto G's 1GB of RAM isn't enough: apps get pushed out of memory very often, and occasionally returning to the home screen requires a redraw, which can take a little over a second. The lack of RAM is enough to be annoying at times. For example, when using Facebook Messenger's Chat Heads feature with a large app like Chrome open at the same time, I found that the phone's memory management killed Messenger so often that my conversations disappeared without notice.

The Android 4.4 KitKat update did reduce the magnitude of this problem a bit thanks to its RAM management optimizations, but I still find that apps get pushed out of memory too often.

The Moto G isn't exactly great for gaming, but it can run today's most demanding games at a decent framerate. For example, I've been playing Asphalt 8 on the Moto G, and at high settings I rarely encountered any noticeable lag. It's almost definitely the best gaming experience in the Moto G's price segment. 

As the Snapdragon 400 is built on 28nm silicon and the Cortex-A7 core is oriented toward power efficiency, I never encountered any thermal issues with the Moto G. Even when gaming for extended periods, the Moto G never got more than slightly warm.

Between the low-power SoC, the relatively modest display and the large-ish battery, the Moto G has pretty good battery life. With moderate usage, it lasts one to two work days.

Conclusion

The Moto G is a bit like a budget version of Google's Nexus phones (that would actually make sense, since Motorola was owned by Google when the Moto G was released). In other words, it's a basic phone, with a basic, stock Android build, which offers a lot for its price, making as few compromises as possible. Motorola managed to make a phone which has a great screen, combined with excellent build quality and good internal hardware in a price range marked by horrible displays and flimsy build quality. 

If your budget is capped at $200, the Moto G is undoubtedly the best phone you can get in that range; I daresay it's even better than many phones above it. It's still no match for flagships: it doesn't offer LTE connectivity, its display is smaller and lower-density, its performance leaves something to be desired, and its camera's not exactly great either. But considering its price, the Moto G is a great phone.