Sunday, November 24, 2013

Apple iPad Air vs Samsung Galaxy Note 10.1 (2014 Edition): Tablet Comparison


The holiday season is almost upon us, and the biggest players in the tablet market now have their latest flagships available. The iPad Air, from Apple, and the Galaxy Note 10.1 2014 Edition, from Samsung, are two of the most interesting tablet flagships this holiday season. Both have very high-end specs, including high-resolution displays and very powerful processors, along with a (perhaps too) high price tag. But which one is the better value?

| | Apple iPad Air | Samsung Galaxy Note 10.1 (2014) |
|---|---|---|
| Body | 240 x 169.5 x 7.5mm, 469g (Wi-Fi) / 478g (LTE) | 243 x 171 x 7.9mm, 540g (Wi-Fi) / 547g (LTE) |
| Display | 9.7" IPS LCD, 2048 x 1536 (264ppi) | 10.1" TFT LCD, 2560 x 1600 (299ppi) |
| Storage | 16/32/64 GB, 1 GB RAM | 16/32 GB, 3 GB RAM |
| Connectivity | Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G) | Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G) |
| Camera (Rear) | 5 MP, F/2.4 aperture, HDR, face detection, 1080p@30fps video | 8 MP, LED flash, face detection, 1080p@60fps video |
| Camera (Front) | 1.2 MP, face detection, 720p@30fps video | 2 MP, 1080p@30fps video |
| OS | iOS 7 | Android 4.3 Jelly Bean |
| Processor | Apple A7 (dual-core Cyclone @ 1.4GHz + PowerVR G6430 @ 450MHz) | Wi-Fi: Exynos 5420 (quad-core Cortex-A15 @ 1.9GHz + quad-core Cortex-A7 @ 1.3GHz + Mali-T628); LTE: Qualcomm Snapdragon 800 MSM8974 (quad-core Krait 400 @ 2.3GHz + Adreno 330 @ 450MHz) |
| Battery | Non-removable Li-Po 8,820 mAh; 10hrs video playback | Non-removable Li-Po 8,220 mAh; 10hrs video playback |
| Starting Price | $499 (16GB, Wi-Fi) | $549 (16GB, Wi-Fi) |
| Accessories | -- | S Pen |


Design




Both Samsung and Apple have produced good designs for their flagship tablets, but it'll come down to the usual plastic vs aluminium debate, or in this case faux leather vs aluminium. While the iPad Air keeps an all-aluminium design, this time inspired by the iPad mini rather than the previous iPad, Samsung has done the same as it did with the Note III, replacing glossy plastic with a back casing that is still plastic but now disguised as leather. I'm not sure I appreciate the faux leather design at all. Personally, not only do I prefer the iPad's aluminium construction, but I think the faux leather looks so old-fashioned that even the glossy plastic Samsung used previously might look better. That's just my opinion, though, and ultimately it'll come down to personal taste. At least the faux leather gives the Galaxy Note 10.1 more grip than the iPad Air. The Galaxy Note 10.1 is available in black and white (bezel color included), and the iPad Air is similarly available in "Space" gray and silver.

Both the Galaxy Note 10.1 and the iPad Air are remarkably thin and light. They are in fact among the thinnest and lightest tablets available, but the iPad Air is definitely the winner in this department. It's technically thinner than the Note 10.1 (7.5mm vs 7.9mm), but the difference is so small it's practically unnoticeable in use. While their thickness is on the same level, the iPad Air is significantly lighter than the Note 10.1 (469g vs 540g), and in this case the difference is definitely noticeable. The Note 10.1 is still lighter than most other tablets, though.

Display

The display is possibly the area where these two tablets fare the best. Both are large, crisp, bright, and colorful. The iPad Air, much like two of its predecessors, has a 9.7" display with a 4:3 aspect ratio and a resolution of 2048 x 1536, which gives the screen 264ppi pixel density. The Note 10.1 has, like the name implies, a 10.1" display with a 16:10 aspect ratio that packs 2560 x 1600 pixels and has a pixel density of 299ppi.
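Those pixel-density figures can be sanity-checked from the resolution and diagonal alone (ppi is just the diagonal pixel count divided by the diagonal in inches); a quick sketch:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density = diagonal length in pixels / diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2048, 1536, 9.7)))   # iPad Air: 264
print(round(ppi(2560, 1600, 10.1)))  # Galaxy Note 10.1: 299
```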

Perhaps the most fundamental difference between the displays is the aspect ratio. The almost-square 4:3 display on the iPad Air makes it better to use in portrait mode and more suited for reading and web browsing, while the wide 16:10 display on the Note 10.1 is better suited for landscape use and generally better for watching videos, though it frankly makes portrait use a bit awkward.

You may think that the difference between 264ppi and 299ppi is huge, but honestly, it's hard to notice the Galaxy Note 10.1 being any crisper than the iPad Air, especially at the usual viewing distance. The difference is there, however, and any eagle-eyed person would probably notice a slight difference in sharpness. 

Leaving the numbers and quantitative data aside, both the Note 10.1's and the iPad Air's displays are sufficiently bright. Viewing angles are good, as is expected of any half-decent tablet these days, and colors are accurate and satisfyingly saturated in both tablets. 

Performance

Both of these tablets have the most powerful processors available to handle their ultra high-resolution duties. In the iPad Air we have the same A7 SoC found in the iPhone 5s and the Retina iPad mini, and in the Galaxy Note 10.1 we have either the Snapdragon 800 or the rarer Exynos 5 Octa (5420) SoC, for the LTE and Wi-Fi models respectively. All of these SoCs are built on a 28nm process node to keep power consumption low.

The A7's CPU technology has gained quite a bit of popularity since its launch back in September. That's because it's the first CPU to utilize the ARMv8 ISA, which happens to be a 64-bit architecture, hence also making it the first 64-bit mobile SoC. Apart from the new ISA, Apple made its new Cyclone CPU core the widest mobile CPU yet seen. With all that power packed into a single core, Apple needed no more than two of them, at a relatively low 1.4GHz clock speed, to match its competitors' performance. As benchmarks show, the dual-core Cyclone CPU @ 1.4GHz is perfectly capable of competing with the latest quad-core beasts, and since it packs much more power into a single core, the A7 really stands out from its competitors in single-threaded CPU benchmarks.

The CPU in the Exynos 5420 SoC in the Wi-Fi Galaxy Note 10.1 is one of the few CPUs to utilize ARM's big.LITTLE technology. Based on the 32-bit ARMv7 ISA, the Exynos 5420 contains two CPU clusters: a high-performance cluster to handle demanding tasks, and a low-power cluster for handling lighter tasks while reducing power consumption. The high-performance cluster contains four Cortex-A15 cores clocked at 1.9GHz, while the low-power cluster has four Cortex-A7 cores @ 1.3GHz.

The Qualcomm Snapdragon 800 variant of the Galaxy Note 10.1 (the LTE version) has, like Apple's chip, a custom CPU core, dubbed Krait 400 and based on the 32-bit ARMv7 ISA. The Snapdragon 800 has four Krait 400 cores at an insane 2.3GHz clock speed.




Like I said before, since the A7's Cyclone CPU has a 64-bit architecture and is wider than all of its competitors, it manages a much higher score in single-threaded CPU benchmarks. In multi-threaded applications the A7 has the disadvantage of fewer cores, but it can still definitely keep up with its quad-core competition. The multi-threaded test puts the A7 very close to the Exynos 5420, though both processors lag behind the Snapdragon 800.

With the CPU out of the way, let's focus on the GPU of the Galaxy Note 10.1 and the iPad Air. The A7 SoC follows Apple's tradition of licensing GPUs only from ImgTech, and so we have a PowerVR G6430 graphics processor in the A7. On the Wi-Fi Note 10.1's Exynos 5420 processor there's an ARM Mali-T628 GPU, and the LTE Note 10.1 has an Adreno 330 GPU. All of these GPUs are among the most powerful mobile GPUs available, so we turn to GFXBench to tell us which one comes out on top.
Note: Unfortunately there are no benchmark scores yet available for the Snapdragon 800-based Note 10.1, so I'm taking data from the closest match I could find, the Galaxy Note III with Snapdragon 800. However, I'll omit the Note III scores from the Onscreen tests due to the difference in resolution between the Note III and the Note 10.1.
The T-Rex HD Offscreen test shows the PowerVR G6430 in the iPad Air remarkably close to the Adreno 330, while the Mali-T628 GPU in the Wi-Fi Note 10.1 lags behind them both, though it at least outperforms the NVIDIA Tegra 4 SoC in the ASUS Transformer Pad.

The lighter Egypt HD Offscreen test shows the iPad Air's GPU falling behind both the Wi-Fi Galaxy Note 10.1 and the Snapdragon 800-powered Galaxy Note III, and puts the Snapdragon 800 at the top of the chart.

Note that since these two tests are rendered at a fixed, non-native resolution, the difference in resolution between the Note 10.1 and the iPad Air doesn't affect the scores here.



The Onscreen tests illustrate how the 1 million more pixels that these two Android flagships have to push versus the iPad Air bog down their performance. The T-Rex HD test shows that the iPad Air managed a much higher score compared to the Exynos-based Galaxy Note 10.1 and the Tegra 4 ASUS Transformer Pad TF701T. 
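The "1 million more pixels" figure is easy to verify from the two resolutions:

```python
ipad_air = 2048 * 1536   # 3,145,728 pixels
note_101 = 2560 * 1600   # 4,096,000 pixels

print(note_101 - ipad_air)            # 950272 -- roughly a million extra pixels per frame
print(round(note_101 / ipad_air, 2))  # 1.3 -- about 30% more rendering work
```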

Since the Egypt HD test is much lighter than T-Rex HD, the margin between the iPad Air and its competitors narrows. However, it's still clear that the iPad Air, thanks to its significantly lower resolution, can push more frames than its 1600p Android competitors.

Until the Snapdragon 800-based LTE Galaxy Note 10.1 is released, there's no data to indicate how it compares to the iPad Air in the Onscreen tests. If I were to guess, I'd say that even though the Adreno 330 is slightly more powerful than the PowerVR G6430 in the Apple A7, its performance advantage won't be enough to offset the resolution difference between the two tablets.

Usually, the Onscreen tests would most accurately mimic real-world gaming performance, given that Android and iOS games tend to run at the device's native screen resolution. Since the iPad 3, however, developers have been taking another approach: on specific ultra high-res devices, in order to avoid performance issues, the game runs at a lower-than-native resolution and is then upscaled to the device's screen resolution. For example, a developer might program a game to run at 1920 x 1200 and then scale to 2560 x 1600 on the Galaxy Note 10.1 to keep framerates high. Given how the Galaxy Note 10.1's higher resolution obviously puts it behind its iPad competitor, this sort of optimization might be necessary to keep a decent framerate in very demanding 3D games.
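The savings from that sort of sub-native rendering are easy to quantify (the 1920 x 1200 render target here is my hypothetical example from above, not a documented game setting):

```python
native = 2560 * 1600  # Galaxy Note 10.1 panel: 4,096,000 pixels
render = 1920 * 1200  # hypothetical internal render target: 2,304,000 pixels

# The GPU only shades the internal target; the upscale is comparatively cheap.
print(f"{render / native:.0%}")  # 56% -- roughly half the per-frame shading work
```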

Conclusion

The Galaxy Note 10.1 and the iPad Air are in fact similar in many ways. They both have thin, light designs (although the plastic vs metal war continues with these flagships), displays with a very high resolution, large batteries and some of the best-performing SoCs available. 

In hardware terms, the iPad Air and the Galaxy Note 10.1 2014 Edition are indeed very similar, so it'll probably come down to software to determine which one is best for you. With the Note 10.1, we have Android 4.3 (and soon enough 4.4) with Samsung's TouchWiz UI added on top, and the iPad Air obviously runs iOS 7. 

The addition of the S Pen digitizer might make you choose the Note 10.1 over the iPad Air, but that'll be only if you really value the advantages that a stylus brings.

Selling for the usual $499 for the 16GB Wi-Fi version, the iPad Air is an expensive tablet, although not as expensive as the Galaxy Note 10.1, which sells for $549 in its 16GB Wi-Fi version. $549 is a lot to ask, so unless the S Pen is really useful to you or you strongly prefer the Android ecosystem, the iPad Air offers more bang for your buck than the Galaxy Note 10.1 2014 Edition.

Saturday, November 16, 2013

Apple A7 vs NVIDIA Tegra 4 vs Snapdragon 800: SoC Wars


Mobile SoC performance has become one of the most competitive aspects of the mobile sector. Since 2010, when the iPad made it clear how important processing power is for mobile devices, performance has grown exponentially, and SoC vendors have competed ever more fiercely. In 2013, the main SoC manufacturers can be narrowed down to Qualcomm, Apple, NVIDIA and, to a lesser extent, Samsung. TI used to be a big player in the SoC market, but this year it practically disappeared from the sector. Now that these companies have their latest silicon shipping in commercially available products, in time for the holiday season, it's time to put their best offerings to the test and see who comes out on top.

| | Apple A7 | NVIDIA Tegra 4 | Snapdragon 800 |
|---|---|---|---|
| Process Node | 28nm HKMG | 28nm HPL | 28nm HPM |
| Die Size | 102mm² | ~80mm² | 118.3mm² |
| Instruction Set | ARMv8 (64-bit) | ARMv7 (32-bit) | ARMv7 (32-bit) |
| CPU | Dual-core Cyclone @ 1.3/1.4GHz | Quad-core Cortex-A15 @ 1.9GHz + low-power Cortex-A15 @ 825MHz | Quad-core Krait 400 @ 2.3GHz |
| GPU | PowerVR G6430 @ 450MHz | 72-core ULP GeForce @ 672MHz | Adreno 330 @ max 550MHz |
| RAM | 32-bit dual-channel LPDDR3-1600 (12.8GB/s) | 32-bit dual-channel LPDDR3/DDR3L-1866 (14.9GB/s) | 32-bit dual-channel LPDDR3-1866 (14.9GB/s) |



The CPU: Dual-core vs Quad-core

Apple's most impressive feat in the mobile performance sector so far is that, in an age of quad-cores with insane clock speeds, Apple has never shipped a device with more than two CPU cores, at relatively low clock speeds, and has still managed to at least keep up with the latest competition. Let's see how Apple's latest CPU, the dual-core Cyclone with a max clock speed of 1.4GHz, stacks up against NVIDIA's latest offering, the Tegra 4's four Cortex-A15s @ 1.9GHz, and the Snapdragon 800's four Krait 400 cores @ 2.3GHz.

Architecturally speaking, Apple's CPU is far superior to the Cortex-A15 and the Krait 400. That's because the A7's CPU runs on the brand new 64-bit ARMv8 architecture. Beyond 64-bit addressing, ARMv8 brings a cleaner instruction set and more general-purpose registers, giving the Cyclone CPU a tangible performance gain in some cases over traditional 32-bit solutions. Not only that, but Apple has made the Cyclone core much wider than its predecessor, the Swift core. In fact, I think it's the widest mobile CPU so far. The wider architecture plus 64-bit give the Cyclone cores much better single-threaded performance than any of their competitors, and remember that in most use cases single-threaded performance is the most important. Kudos to Apple for competing against monstrous quad-cores with only a dual-core.

The NVIDIA Tegra 4's CPU uses NVIDIA's Variable Symmetric Multi-Processing (vSMP) architecture, introduced with the Tegra 3. Similar in spirit to ARM's big.LITTLE, the Tegra 4 consists of a main CPU cluster, composed of four high-performance Cortex-A15 cores running at up to 1.9GHz, and a shadow A15 core that can go up to 825MHz. When CPU demand is low, the quad-core A15 cluster is power-gated and all processing transfers to the shadow A15 core, and it remains that way as long as CPU demand stays low enough. The advantage is, of course, reduced power consumption.
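The switching policy can be sketched as a toy model. To be clear, the 15% threshold and the decision logic below are illustrative assumptions of mine; NVIDIA's actual governor is far more elaborate:

```python
def active_cluster(cpu_load: float, threshold: float = 0.15) -> str:
    """Toy vSMP-style switch: hand all work to the low-power shadow core
    when total CPU demand is low, otherwise wake the main A15 cluster.
    The 15% threshold is an illustrative guess, not NVIDIA's value."""
    return "shadow A15 @ 825MHz" if cpu_load < threshold else "quad A15 @ 1.9GHz"

print(active_cluster(0.05))  # light background work -> shadow core
print(active_cluster(0.80))  # demanding task -> main cluster
```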

Qualcomm's Snapdragon 800 uses Qualcomm's own custom CPU core, dubbed Krait 400. Since Qualcomm likes to keep its mouth shut about its CPU architectures, not much is known about it. What we do know is that the Krait 400 is mostly the Krait 300 core moved to a 28nm HPM process, and that move from 28nm LP to 28nm HPM means there's been some relayout in the Krait 400. Other differences from the Krait 300 include lower memory latency. Apart from that, we only know that, like the Cortex-A15 it competes with, the Krait 400 is a 3-wide machine with out-of-order (OoO) execution. The move to HPM means the Krait 400 can achieve higher clocks than its predecessor, which accounts for the insane 2.3GHz max clock speed. Put four of those monster cores together and you potentially have the most powerful mobile CPU to date. It still lags behind the Apple A7 in single-threaded performance, however, which is also very important in mobile OSes.

Now let's put in some quantitative information to see how these CPUs compare in their actual performance: 

What I said before about single-threaded performance shows here: Apple's Cyclone cores deliver at least 50% more performance on a single core than any of their competitors. But because the A7 has only two cores while its main competitors have four, in multi-threaded situations the A7 loses that advantage, although it can still keep up with all of them. It's very impressive how Apple consistently matches quad-core performance with only two cores.

The GPU and Memory

Apple has always put more emphasis on the GPU than on the CPU in its SoCs, and the A7 is no different. Apple continues to license GPUs from Imagination Technologies, as it has been doing since the first iPhone. This time around, Apple is using a PowerVR "Rogue" series GPU, which is based on ImgTech's latest technology and, of course, supports OpenGL ES 3.0. The exact model in the A7 is the G6430 variant, which contains four GPU clusters with 32 unified shader units each. That equates to a total of 128 shader units at a clock speed of 450MHz.

Ironically, the NVIDIA Tegra 4's GPU is the least fancy of the current high-end mobile GPUs. Designed by NVIDIA, the GPU in the Tegra 4 traces back to the ancient NV40-era architecture (the same used in the GeForce 6000 series), hence it's the only modern mobile GPU that still uses discrete pixel and vertex shaders. In this case, there are a total of 72 shader units, 48 of which are pixel shaders and the remaining 24 vertex shaders. The GPU runs at a max clock speed of 672MHz. The biggest limitation of the Tegra 4's GeForce GPU is that it only supports OpenGL ES 2.0. Right now this isn't really a problem, as game developers haven't yet migrated to OpenGL ES 3.0, but it practically destroys the Tegra 4's future-proofing.

Finally, we have the Snapdragon 800 with its Adreno 330 GPU. Like I said before, Qualcomm likes to reveal as little information as possible about its SoCs, and the Adreno line of GPUs is probably the biggest mystery of the bunch. All I can say is that it's a unified shader architecture compatible with the latest OpenGL ES 3.0 API. The Adreno 330, in its highest configuration, runs at 550MHz, but the vast majority of Snapdragon 800 devices have their GPUs clocked at 450MHz. Note that the benchmark results I'll show later reflect the Adreno 330's performance at 450MHz, since no device with the 550MHz bin of the Adreno 330 has shipped yet.

| | Snapdragon 800 | Apple A7 | NVIDIA Tegra 4 | NVIDIA Tegra 4i |
|---|---|---|---|---|
| GPU Name | Adreno 330 | PowerVR G6430 | 72-core GeForce | 60-core GeForce |
| Shader Cores | ? | 4 | 4 pixel; 6 vertex | 2 pixel; 3 vertex |
| ALUs/Core | ? | 32 | 12 pixel; 4 vertex | 24 pixel; 4 vertex |
| Total ALUs | ? | 128 | 72 (48 pixel, 24 vertex) | 60 (48 pixel, 12 vertex) |
| Max Clock Speed | 550MHz | 450MHz | 672MHz | 660MHz |
| Peak GFLOPS | ? | 115.2 | 96.8 | 79.2 |


Peak theoretical compute power puts the Tegra 4 behind the A7, but close enough to call it competitive. Be aware, however, that while the A7's unified shader architecture makes its full 115.2 GFLOPS available in any situation (the same applies to the Adreno 330), the story is quite different with the Tegra 4. Its discrete shader architecture means the GPU's peak 96.8 GFLOPS can only be achieved when the mix of pixel and vertex shader work matches the ratio between pixel and vertex shader hardware (2:1), so most of the time the GPU achieves less than 96.8 GFLOPS.
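The peak numbers in the table follow from ALU count × 2 FLOPs (one multiply-add) per cycle × clock speed, and the pixel/vertex mix argument can be made concrete with a simplified throughput model (the model is my own illustration, not a published NVIDIA figure):

```python
def peak_gflops(alus: int, clock_ghz: float) -> float:
    # One multiply-add (2 FLOPs) per ALU per cycle
    return alus * 2 * clock_ghz

print(round(peak_gflops(128, 0.450), 1))  # A7 / PowerVR G6430: 115.2
print(round(peak_gflops(72, 0.672), 1))   # Tegra 4: 96.8

def tegra4_effective_gflops(pixel_fraction: float) -> float:
    """Discrete shaders: pixel and vertex ALUs can't cover for each other,
    so throughput is capped by whichever pool the workload saturates first.
    48 pixel + 24 vertex ALUs @ 672MHz."""
    pixel_peak = peak_gflops(48, 0.672)   # ~64.5 GFLOPS
    vertex_peak = peak_gflops(24, 0.672)  # ~32.3 GFLOPS
    vertex_fraction = 1 - pixel_fraction
    cap = min(pixel_peak / pixel_fraction if pixel_fraction else float("inf"),
              vertex_peak / vertex_fraction if vertex_fraction else float("inf"))
    return min(cap, peak_gflops(72, 0.672))

print(round(tegra4_effective_gflops(2 / 3), 1))  # ideal 2:1 mix: 96.8
print(round(tegra4_effective_gflops(0.9), 1))    # pixel-heavy mix: 71.7
```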

There may not be a huge gap in theoretical compute between the A7's and Tegra 4's GPU, but the architectural difference is astounding. You can hardly put a unified shader architecture that supports OpenGL ES 3.0 in the same league as a discrete pixel and vertex shader architecture that is limited to OpenGL ES 2.0. While these differences may not affect real-world performance, the omission of OpenGL ES 3.0 is bad for future-proofing. 

Interestingly, every current high-end SoC uses pretty much the same memory interface. The Tegra 4, Apple A7 and Snapdragon 800 all have dual-channel LPDDR3/DDR3L solutions, except that the Tegra 4 and the Snapdragon 800 allow a slightly higher clock speed (933MHz) than the A7 (800MHz), giving the A7 12.8 GB/s of peak theoretical memory bandwidth versus 14.9 GB/s on the Tegra 4 and Snapdragon 800. While the A7 technically has less theoretical memory bandwidth than its competitors, it counteracts this with a very interesting solution: 4 MB of on-die SRAM acting as an L3 cache, which keeps a good chunk of traffic off the main memory interface and thus frees up bandwidth. You may recall that a similar solution is used in the Xbox One's SoC to increase effective memory bandwidth.
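The quoted bandwidth numbers come straight from the bus width and transfer rate (DDR transfers twice per clock, so 800MHz and 933MHz correspond to 1600 and 1866 MT/s):

```python
def bandwidth_gbs(channel_bits: int, channels: int, mts: int) -> float:
    """Peak bandwidth = channel width in bytes x channels x mega-transfers/s."""
    return channel_bits / 8 * channels * mts / 1000  # MB/s -> GB/s

print(bandwidth_gbs(32, 2, 1600))            # A7: 12.8 GB/s
print(round(bandwidth_gbs(32, 2, 1866), 1))  # Tegra 4 / Snapdragon 800: 14.9 GB/s
```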

Considering the 4MB SRAM on the A7's die, it may turn out that the A7 can deliver significantly more memory bandwidth than the Tegra 4, but still, both have enough memory bandwidth to power ultra high-resolution (>1080p) tablets comfortably. 

The T-Rex HD test shows the Tegra 4 significantly behind the Apple A7 and places it as the slowest of the high-end mobile GPUs. The Apple A7, meanwhile, is beaten only by the Snapdragon 800, and only by a very small margin.

The less intensive Egypt HD test also shows the Tegra 4 behind the A7 and other high-end mobile SoCs, but by a smaller margin. The A7 is the second slowest of these SoCs in this test, achieving slightly lower scores than the Mali-T628 in the Exynos 5420 and the Adreno 330 in the Snapdragon 800. Both tests show the Snapdragon 800 as the supreme mobile GPU.
ImgTech GPUs have always had industry-leading fill rate capabilities, and it shows in the A7: the PowerVR G6430 GPU has a much higher fill rate than any of its competitors. On the other end of the spectrum we have the Tegra 4. Tegra GPUs have a tendency to be substandard in terms of fill rate, and it shows here too: the Tegra 4 manages a significantly lower fill rate score than every one of its competitors, especially the Apple A7. That's a problem, because the Tegra 4 currently powers some of the few tablets that boast 1600p displays, for example the ASUS Transformer Pad TF701T. On devices with 1080p screens or less, however, even the Tegra 4 probably won't run into any fill rate bottlenecks. The Snapdragon 800 also doesn't do very well here, as it's outperformed even by the Mali-T628 in the Exynos 5420.



Here, the Tegra 4 and the Apple A7 are in the lead, with the Apple A7 pulling ahead slightly.



Adding per-vertex lighting for some reason causes the Apple A7 to lag behind all of its competitors, leaving the Tegra 4 in the lead.


When using per-pixel lighting, the A7 once again falls behind everyone else, and this time the Tegra 4 joins it with the second-lowest score.

Even though in some cases the Apple A7 lags behind its competition severely, I highly doubt this is going to make performance suffer in any way, since most mobile games aren't very geometry bound. 

The Snapdragon 800, while not at the top spot in most of these tests, shows strong scores across the board, outperforming the whole competition by a significant margin in the fragment lit test. 

Power Consumption

All of the current high-end SoCs should have low enough power consumption, since they are all built on 28nm silicon. On the CPU side, the A7 enjoys a low core count as well as a low clock speed, so I don't expect its CPU to draw too much power. The Tegra 4, on the other hand, has four power-hungry Cortex-A15 cores at a much higher clock speed; however, the shadow A15 core has the potential to counteract the extra power consumed when the main A15 cluster is active. The Snapdragon 800 doesn't have any extra low-power cores and relies on the efficiency of the main Krait 400 cores to yield good battery life, but given Qualcomm's record of making CPUs with low idle power, this is definitely not a problem.

One optimization Qualcomm makes to reduce power consumption is running each active core at its own clock speed. Competing architectures can only run every active core at the same clock speed, even when that's unnecessary. So, for example, with two cores active, one fully loaded and the other running a much lighter task, the Krait 400 can keep the first core at its max clock while the second runs at a much lower one, whereas competing CPUs would run both cores at the max clock speed even though the second core doesn't need it. This is one of the many optimizations that make the Krait 400 core very power efficient.
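A toy estimate shows why this matters. Dynamic CPU power scales roughly with frequency × voltage², so letting a lightly loaded core drop to a lower operating point saves real power. The frequency/voltage pairs below are illustrative guesses of mine, not Qualcomm's actual DVFS table:

```python
def dyn_power(freq_ghz: float, volts: float) -> float:
    """Toy dynamic-power model: P ~ f * V^2 (capacitance factored out)."""
    return freq_ghz * volts ** 2

full_load = dyn_power(2.3, 1.05)    # core 1: maxed out under both schemes
light_sync = dyn_power(2.3, 1.05)   # synchronous clocks: core 2 forced to max
light_async = dyn_power(0.8, 0.80)  # per-core clocks: core 2 at a low state

sync_total = full_load + light_sync
async_total = full_load + light_async
print(f"saving: {1 - async_total / sync_total:.0%}")  # prints saving: 40%
```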

I can't really tell whether it's the 72-core GeForce GPU, the PowerVR G6430 or the Adreno 330 that consumes the least power, but given ImgTech's record of making the most power-efficient mobile GPUs, it's not a stretch to assume it's the G6430.



Conclusion

While the Tegra 4, the Apple A7 and the Snapdragon 800 have completely different architectures, I'd say they're pretty close to each other, based on the performance they've shown in synthetic benchmarks. The differences between the CPUs are the most striking. While Apple focused on keeping core count and clock speed low while driving up single-core performance, NVIDIA's (or rather, ARM's) and Qualcomm's solutions offset relatively lower single-threaded performance by using more cores at higher clock speeds. While the former is probably better for overall system performance, as mobile OSes tend to rely much more on single-threaded performance, the latter is probably better for multi-tasking. In any case, it's evident that all current high-end SoCs are surprisingly close when it comes to peak multi-threaded performance.

Comparing the Tegra 4, Apple A7 and Snapdragon 800, as well as the rest of the high-end competition, it's clear that the only one that truly stands apart is the A7. The Tegra 4 and the Exynos 5420, for instance, both have four Cortex-A15 cores at similar clock speeds (1.9GHz and 1.8GHz, respectively), and both have a separate low-power CPU arrangement for handling light tasks (the Tegra 4 has a single shadow A15 core at its disposal, while the Exynos 5420 uses a quad-core Cortex-A7 cluster for the same purpose). The Snapdragon 800 uses a unique architecture, the Krait 400, in a quad-core configuration, and even pushes the clock speed beyond the norm to an insane 2.3GHz; unlike two of its competitors it has no extra low-power cores, relying on other solutions to keep idle power consumption low.

In GFXBench's high-level GPU benchmarks, it seems that all four main high-end SoCs are more or less on the same level, with only the Snapdragon 800 pulling slightly ahead of the A7. In both high-level tests, however, we can see the Tegra 4 lagging behind all of its competition. How ironic.

GFXBench's low-level tests, however, show a huge difference between the current high-end mobile GPUs. In the fill rate department we see the Apple A7 blowing all of its competitors out of the water, with the Tegra 4 at the bottom of the chart and the Snapdragon 800 slightly ahead of it, but still behind the Exynos 5420 and the Apple A7.

The verdict of this comparison is that, while pretty much all of the current flagship SoCs are close in terms of CPU power, the Tegra 4 falters slightly when the GPU is put to the test. The Apple A7 does very well on the GPU side, but it's just slightly outperformed by the Adreno 330 in the Snapdragon 800. Really, though, they're all so close that it's hard to pick a definite winner. You could call the Snapdragon 800 the overall winner, but I say it's too close to call.

Saturday, November 9, 2013

LG G Pad 8.3 Review


LG has made only one previous attempt at an Android tablet (the old Optimus Pad), and it was definitely a failed one. However, with LG's recent success in the smartphone department, it was only a matter of time before they took another shot at the tablet market, and so today we have the LG G Pad 8.3. As the name suggests, the LG slate falls within the 7"-8" segment, which puts it in direct competition with the iPad mini and the Nexus 7. With an 8.3" display of 1920 x 1200 resolution, a powerful if slightly dated Snapdragon 600 processor and a reasonable starting price of $349, the G Pad 8.3 might be just what LG needs to gain some market share in the tablet space.

| | LG G Pad 8.3 | Apple iPad mini 2 | Google Nexus 7 (2013) |
|---|---|---|---|
| Body | 217 x 126.5 x 8.3mm, 338g | 200 x 135 x 7.5mm, 331g (Wi-Fi) / 341g (LTE) | 200 x 114 x 8.7mm, 290g (Wi-Fi) / 299g (LTE) |
| Display | 8.3" IPS LCD, 1920 x 1200 (273ppi) | 7.9" IPS LCD, 2048 x 1536 (324ppi) | 7" IPS LCD, 1920 x 1200 (323ppi) w/ Corning Gorilla Glass |
| Storage | 16/32 GB, 2 GB RAM (microSD expandable) | 16/32/64 GB, 1 GB RAM | 16/32 GB, 2 GB RAM |
| Connectivity | Wi-Fi | Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G) | Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G) |
| Camera (Rear) | 5 MP, HDR, 1080p@30fps video | 5 MP, HDR, 1080p@30fps video | 5 MP, 1080p@30fps video |
| Camera (Front) | 1.3 MP | 1.2 MP, 720p@30fps video | 1.2 MP |
| OS | Android 4.2.2 | iOS 7 | Android 4.3 |
| Processor | Qualcomm Snapdragon 600 APQ8064 (quad-core Krait 300 @ 1.7GHz + Adreno 320 GPU) | Apple A7 (dual-core Cyclone @ 1.4GHz + PowerVR G6430 GPU) | Qualcomm Snapdragon S4 Pro (quad-core Krait @ 1.5GHz + Adreno 320 GPU) |
| Battery | Li-Ion 4,600 mAh | Li-Po 6,430 mAh | Li-Ion 3,950 mAh |
| Starting Price | $349 (16 GB) | $399 (16 GB) | $229 (16 GB) |


Design

One of the best things about the G Pad 8.3 is its amazing design. Many elements of it are somewhat similar to the iPad mini, for instance the narrow side bezels and the aluminium construction. For its size, the G Pad 8.3 is very thin. Measuring just 8.3mm (yes, just like the name of the tablet and the size of the display), it's slightly thinner than the 2013 Nexus 7, but still a bit thicker than the iPad mini. The G Pad 8.3 also weighs 338g, so it's significantly heftier than the Nexus 7 (which, of course, has a much smaller display and battery), and a few grams heavier than the new iPad mini (albeit with a much smaller battery). It's clearly not as svelte as the iPad mini, but for its size and price it's still very thin and light.

The G Pad is one of the few Android tablets with an aluminium back cover, in this case with a brushed finish. At the very center, oriented vertically, is an LG logo, and each side of the back houses a speaker, the two decently spaced apart. Obviously, the landscape positioning of the two speakers means you'll only get a stereo effect when holding the tablet in landscape mode, which makes sense, since watching videos and playing games is usually done in landscape. In the top-left corner sits a 5 MP camera that can shoot up to 1080p@30fps video.

The front of the tablet makes it look a lot like a blown-up LG G2. The 8.3" display uses IPS technology, which means excellent viewing angles and good color reproduction. The 1920 x 1200 resolution results in a very crisp pixel density of 273ppi. While the display is extremely sharp and it's almost impossible to discern individual pixels with the naked eye, it still falls short of the iPad mini 2 and the 2013 Nexus 7, which have 324ppi and 323ppi displays, respectively. And while the difference between 273ppi and 323ppi sounds big, in most cases it's very hard to notice a difference in sharpness between them.

Performance

This is the area where LG has compromised to keep the price of the G Pad 8.3 low. While lately we've only seen flagship devices powered by the latest and greatest SoCs, like the Snapdragon 800 or the Tegra 4, the G Pad 8.3 sports a more modest Snapdragon 600. This processor is composed of four Krait 300 cores running at up to 1.7GHz plus an Adreno 320 GPU. This means the G Pad will perform considerably worse than the iPad mini 2, but will still match the 2013 Nexus 7 in this area. And while the G Pad is nowhere close to the current flagships in terms of processing power, it would be a lie to call it slow. As long as LG doesn't bloat its custom UI too much (which it hasn't), the G Pad 8.3 should offer smooth performance at all times.

Conclusion

LG's return to the tablet market has the potential to be great. The G Pad 8.3, LG's first shot at making a tablet in years, excels in its design with a sufficiently thin and light aluminium chassis, and it has an excellent display. LG's choice of a slightly older processor for the G Pad 8.3 might be a deal breaker to some, though. Priced at $349 for 16 GB, it isn't exactly overpriced, but considering what it offers, it's not that cheap either. Still, the G Pad 8.3 is currently the best 8-inch Android tablet available. If you can afford to spend an extra $50, though, I think the iPad mini 2 will be a better alternative when it becomes available. 

Tuesday, November 5, 2013

Google Nexus 5 Review

The Nexus 5 smartphone has had so many leaks that it barely needed an official announcement. Still made by LG, the latest Nexus smartphone brings all of the expected upgrades: a larger display with a higher resolution, a faster processor, and so on. The Nexus 5 is also the first device to ship with Android 4.4 KitKat, which brings a lot of important changes, including a more refined UI. While the Nexus 5 doesn't have anything that distinguishes it from other flagships, its price tag is as appealing as ever, and the pure Android experience may appeal to some users more than competitors' modified software. 

Google Nexus 5 Apple iPhone 5s LG G2
 Body   138 x 65 x 8.6mm, 130g   124 x 59 x 7.6mm, 112g   138.5 x 71 x 8.9mm, 143g 
 Display   4.95" True HD-IPS+ 1920 x 1080 (445ppi) w/ Corning Gorilla Glass 3  4" IPS 1136 x 640 (326ppi) w/ Corning Gorilla Glass  5.2" True HD-IPS+ 1920 x 1080 (424ppi) w/ Corning Gorilla Glass 2
 Storage   16/32 GB, 2 GB RAM  16/32/64 GB, 1 GB RAM  16/32 GB, 2 GB RAM
 Connectivity   Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G)  Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G)  Wi-Fi, GSM (2G), HSDPA (3G), LTE (4G)
 Camera (Rear)  8 MP with OIS, LED flash and 1080p@30fps video  8 MP with dual-LED flash, 1/3" sensor size, 1.5µm pixel size, HDR, 1080p@30fps and 480p@120fps video  13 MP with OIS, LED flash, HDR and 1080p@60fps video
 Camera (Front)  1.3 MP  1.2 MP and 720p@30fps video  2.1MP and 1080p@30fps video 
 OS  Android 4.4 KitKat  iOS 7  Android 4.2.2 Jelly Bean
 Processor  Qualcomm Snapdragon 800 MSM8974 (Quad-core Krait 400 @ 2.3GHz + Adreno 330 GPU)  Apple A7 (Dual-core Cyclone @ 1.3GHz + PowerVR G6430 GPU)  Qualcomm Snapdragon 800 MSM8974 (Quad-core Krait 400 @ 2.3GHz + Adreno 330 GPU)
 Battery  Non-removable 2,300 mAh
Talk time: 17hrs
Standby time: 300hrs
 Non-removable 1,560 mAh
Talk time: 10hrs
Standby time: 250hrs
 Non-removable 3,000 mAh
Talk time: 17.5hrs
Standby time: 900hrs
 Starting Price (Off-contract)  $349 (16 GB)  $649 (16 GB)  $549 (16 GB)


Design



The Nexus 5 may be cheap, but its design doesn't look cheap at all. While it may not be as premium as the iPhone 5s with its aluminium build, it's still solidly built. The back of the smartphone takes a leaf from the 2013 Nexus 7's design with a matte finish, sold in either black or white. A large Nexus logo is rather awkwardly placed horizontally at the center, accompanied by a small, vertically oriented LG logo at the bottom. An 8 MP camera, which slightly protrudes from the chassis, sits on the top left of the device, with an LED flash below it.

The phone's shape isn't strictly rectangular, with slightly curved sides. The front, of course, consists mostly of the 4.95" 1080p display, which is in line with what's expected from a flagship nowadays, no more, no less. As the Nexus 5 is based on the LG G2, the bezels on either side of the display are very narrow, while the top and bottom bezels are, well, regular for a smartphone. 

The Nexus 5 isn't exactly the skinniest flagship smartphone available. Measuring 8.6mm thick, it's considerably thicker than the iPhone 5 and 5s (7.6mm), and also thicker than Samsung's Galaxy S4 (7.9mm). It's thinner than the LG G2, though. Nevertheless, while it may not be the thinnest, you can't possibly call the Nexus 5 thick. After all, that extremely attractive price tag requires some minor compromises, and for a $349 phone, the Nexus 5 does very well in the thinness department. 

At 130g, the Nexus 5 isn't lighter than the iPhone 5 and 5s (of course, it has a much larger display and battery), but it is one of the lightest 5" flagships: it's considerably lighter than the LG G2 (143g) and weighs the same as the Galaxy S4. Not bad for a phone with such a low price. 

Performance

The Nexus 5 is yet another phone that enjoys the extreme power of the Snapdragon 800 processor. With four Krait 400 cores clocked at 2.3GHz and the monstrous Adreno 330 GPU, it's pretty much one of the absolute fastest smartphones available. That, combined with the trimmed-down Android 4.4 KitKat OS and the complete absence of OEM customizations and bloatware, gives the Nexus 5 impeccable fluidity and performance. 

Conclusion

The Nexus 5 isn't an attempt to revolutionize the smartphone market through fingerprint sensors and weird hand and eye gesture controls and whatnot. Quite the contrary, it's supposed to be a basic smartphone with flagship qualities, achieving success through the simplicity of its hardware and user interface. The exclusion of all those bells and whistles is what allows for its very low price, and for some people, that may be just about perfect.

Some people don't need, or don't want, a bunch of extra hardware and software features they'll barely ever use on their phones. In other words, some people just need a powerful yet easy-to-use phone, and that is the space the Nexus 5 intends to fill. And it does so very well. This phone has the basics a flagship requires nowadays, namely a 5" 1080p display and a Snapdragon 800 processor, plus a basic, unmodified OS, and that's pretty much it. 

If all you want from a phone is a fast, fluid experience across all use cases, from texting to gaming and watching videos, I would strongly recommend the Nexus 5. I would only recommend another phone if basic isn't your kind of thing, that is, if you like the complexity of OEM-modified UIs and nifty hardware extras like fingerprint sensors. But the Nexus 5 is, without a doubt, the best basic smartphone flagship this year.