Sunday, February 1, 2015

CES 2015: Asus unveils the Zenfone 2: $199 Flagship

A flagship smartphone starting at $199? Is that even possible? Well, Asus appears to have achieved it, judging by its CES 2015 announcement of the new Zenfone 2 smartphone. The Zenfone 2 has specs that put it squarely in flagship territory, yet its price tag is similar to that of mid-range phones. Costing as little as $199 before taxes, the Zenfone 2 might just offer the most bang for the buck we've seen in a phone since the Moto G.

Asus' new smartphone comes in many variants at different price points, and instead of offering only different storage capacities like most phones, the Zenfone 2 is available in different configurations of storage, RAM, and even processor speed. Of course, the $199 price tag applies to the simplest configuration, which is itself no slouch: 16GB of storage, 2GB of RAM and an Intel Atom Z3560 processor (quad-core, 1.8GHz). Considering that most phones in this price range offer much less, Asus deserves real credit for delivering such a good phone at this price.

Of course, it is very possible that Asus made some compromises to reach the Zenfone 2's low starting price, and that can only be confirmed once the phone is actually available for purchase. For now, though, Asus has an astounding phone on its hands, at least on paper.

Asus Zenfone 2
 Body:  152.5 x 77 x 3.9-10.9mm, 170g
 Display:  5.5" IPS LCD, Full HD (1920 x 1080, 403ppi), with Corning Gorilla Glass 3
 Data:  LTE Cat. 4, 150/50 Mbps (FDD-LTE bands 1/2/3/4/5/7/8/9/17/18/19/20/28/29; TDD-LTE bands 38/39/40/41)
 Connectivity:  WiFi 802.11a/b/g/n/ac, Bluetooth 4.0, 3.5mm audio jack, microUSB 2.0 (with USB OTG), microSD card slot (up to 64GB)
 Camera (rear):  13MP, 5-element f/2.0 lens, dual-color LED flash, 1080p@30fps video
 Camera (front):  5MP, f/2.0 aperture, 85-degree wide-angle lens
 Storage/RAM:  16/32/64 GB storage, 2/4 GB RAM; Asus WebStorage: 5GB (lifetime)
 Processor:  Intel Atom Z3560/Z3580
 CPU:  Quad-core Silvermont (4C/4T) @ 1.8GHz/2.3GHz
 GPU:  PowerVR G6430 @ 533MHz (136 GFLOPS)
 OS:  Android 5.0 Lollipop with Asus ZenUI
 Battery:  Non-removable Li-Ion 3,000 mAh


The Zenfone 2 is undeniably a beautiful phone. There are many notable details in the Zenfone 2's design, the sum of which might just make Asus' new design the best one we'll see in a while. Most notably, Asus decided, like LG, to move the traditional side-mounted volume keys to the back of the device, aiming for a reduction in bezel width. Unlike LG, though, which moves the power button to the back along with the volume keys, Asus moved only the volume keys, keeping the power button on the top of the phone. Speaking of bezel width, Asus' new phone has extremely slim bezels, which make the phone that bit more attractive, with a relatively good screen-to-body ratio of 72%.
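The quoted screen-to-body ratio can be roughly sanity-checked from the listed dimensions. The sketch below derives the panel's footprint from its 5.5" diagonal and an assumed 16:9 aspect ratio (neither the exact active area nor the rounding of the quoted diagonal is published, so this is an estimate, not Asus' method):

```python
import math

def screen_to_body_ratio(diag_in, aspect_w, aspect_h, body_w_mm, body_h_mm):
    """Estimate screen-to-body ratio (%) from panel diagonal and body footprint."""
    diag_mm = diag_in * 25.4
    unit = diag_mm / math.hypot(aspect_w, aspect_h)   # mm per aspect-ratio unit
    screen_area = (aspect_w * unit) * (aspect_h * unit)
    body_area = body_w_mm * body_h_mm
    return 100 * screen_area / body_area

# Zenfone 2: 5.5" 16:9 panel in a 152.5 x 77 mm body
print(round(screen_to_body_ratio(5.5, 16, 9, 77, 152.5), 1))  # ~71, close to the quoted 72%
```

The small gap to the official 72% figure is expected, since manufacturers round the panel diagonal and measure the active area slightly differently.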

Other notable features of the design include the all-aluminium rear and the Zenfone series' signature spun metal strip on the bottom of the front of the phone. The rear of the device is entirely made of brushed metal, in a rather HTC One-like fashion, which makes for a very premium looking phone, despite the low starting price. 

ZenUI is what Asus calls its Android skin. The newest version of ZenUI has Android 5.0 Lollipop, the latest Android version, at its core. Overall, ZenUI is one of the best Android skins available, and in many ways the opposite of TouchWiz: it keeps layout and functionality close to stock and adds little bloatware, focusing instead on changing the look of Android.

ZenUI makes Android look very sleek with a flat, modern user interface, all without being intrusive to the inexperienced user. In general, this must be one of the best Android skins, if not the best.


The Zenfone 2's display is on par with current flagships, which is very good for a phone with such a low starting price. What we have in hand is a 5.5" IPS LCD with Full HD resolution. At that screen size, the Zenfone definitely steps into phablet territory, even if the extremely slim side bezels help reduce the phone's footprint.

If anyone's disappointed that Asus used "just" a 1080p display instead of a QHD screen: don't be. Any resolution increase from 1080p on a 5.5" phone would be very hard to notice. The difference is definitely not noticeable enough to offset the extra power consumption and performance impact that comes with QHD. At this point, it's much better to keep resolution where it is and work on other aspects of the display, and Asus' choice to keep it to 1080p might even give it a performance advantage over the competition. 

Anyway, Asus' phone ends up with a 403ppi pixel density, which should be perfectly fine unless you are trying to see some really small details. The Zenfone 2 will look as sharp as any flagship phone should.
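Pixel density is simply the diagonal pixel count divided by the diagonal size in inches. A quick sketch for the Zenfone 2 (the computation lands near 401 rather than the quoted 403, because the 5.5" diagonal is itself a rounded figure):

```python
import math

def ppi(width_px, height_px, diag_in):
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width_px, height_px) / diag_in

print(round(ppi(1920, 1080, 5.5)))  # ~401; Asus quotes 403 from the exact panel diagonal
print(round(ppi(2560, 1440, 5.7)))  # the Note 4's quoted 515ppi checks out the same way
```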


You can actually choose between two different processors for the Zenfone 2, depending on how much you're willing to pay for the phone. Ever so faithful to Intel, Asus has again decided to ship their flagship phone with an Atom processor, in this case, either the Z3560 or the Z3580. Since both available processor options belong to the same platform (Moorefield), both processors are physically identical, and the difference between them comes down to clock speed.

Both processors are built on a 22nm process, and include a CPU consisting of four Silvermont cores with 2MB of L2 cache. Since Silvermont is a 64-bit CPU and the Zenfone 2 runs Android 5.0, the phone will benefit from the processor's 64-bit processing capabilities.

Also, both processors use a PowerVR G6430 GPU (the same as in the Apple A7), with a base frequency of 457MHz and a burst frequency of 533MHz. In burst mode, the G6430 can deliver a peak compute power of 136 GFLOPS, which is slightly above the Snapdragon 800's Adreno 330, but well below the Snapdragon 805's Adreno 420. The GPU is the only aspect of the Zenfone 2 that is not on par with flagships, but then again, it's hardly a slouch.

The differences between the Z3560 and the Z3580 come down to CPU clock speed. While the Z3560 has its four cores clocked at up to 1.83GHz, the Z3580 can go all the way to 2.33GHz. That's a 500MHz increase per core, which should definitely make a difference in everyday use... and in battery life.
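The 136 GFLOPS figure quoted for the G6430 follows from a standard peak-throughput calculation: ALU lanes times two FLOPs per lane per clock (a fused multiply-add) times the burst clock. The 128-lane count comes from public PowerVR Series6 material, not from Asus, so treat this as a back-of-the-envelope check:

```python
def peak_gflops(alu_lanes, flops_per_lane_per_clock, clock_mhz):
    """Peak FP32 throughput in GFLOPS for a GPU with the given lane count and clock."""
    return alu_lanes * flops_per_lane_per_clock * clock_mhz / 1000

# PowerVR G6430: assumed 128 FP32 lanes, FMA = 2 FLOPs/clock, 533MHz burst
print(round(peak_gflops(128, 2, 533)))  # ~136, matching the figure in the spec table
```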


A flagship phone for $199 is a combination most people would call impossible. However, at least on paper, Asus really didn't cut back on any features to reach the low price. Every aspect of this phone is flagship-level, and that achievement could make the Zenfone 2 a very successful smartphone.

You get everything you could possibly ask for from a flagship: a beautiful design, the latest Android version, good cameras (although 4K recording would have been appreciated), a large, high-quality, high-resolution display, and a good processor (even if not exactly among the best). All for a very low starting price. Even if the $199 version has "only" 2GB of RAM and a lower-clocked CPU, you're still getting much more than from any other phone in this price range.

In general, Asus has done very well with its latest flagship. It has the specs of a flagship and the price of a mid-ranger. How that achievement will translate into sales figures remains to be seen, but if Asus gets availability and marketing right, its Zenfone 2 could be a huge success.

Monday, January 19, 2015

CES 2015: Nvidia Announces the Tegra X1 Processor

During this year's Consumer Electronics Show (CES), Nvidia gave us a glimpse of its next-generation mobile processor. And it's fantastic. Implementing Nvidia's new mobile-first strategy for the first time, the company was able to quickly port its latest graphics architecture, dubbed Maxwell, to the Tegra line. As a result, the new Tegra X1 processor has a GPU based on Nvidia's newest architecture. Thanks to Maxwell's advanced power efficiency, Nvidia was able to build a very, very powerful GPU for the Tegra X1 without exceeding the power budgets that define the ultra-mobile market. The Tegra X1 will most likely be destined for high-performance and gaming tablets, and maybe even high-end Chromebooks.


The truth is that 28nm is getting old. This generation, we're starting to see manufacturers move to the smaller 20nm process node. Apple has done it with its A8 and A8X SoCs, and Samsung with its Exynos 5433. Nvidia has now jumped on the 20nm bandwagon with the Tegra X1. Built by TSMC, the Tegra X1 is Nvidia's first SoC to benefit from a 20nm process, and that should really help keep power consumption in check, which Nvidia needs, especially considering the extremely beefy GPU.


Unlike the Tegra K1, which was announced in both a 32-bit Cortex-A15 version and a 64-bit Denver version, the Tegra X1 has so far been mentioned in only one version, which ditches Nvidia's own Denver core in favor of stock ARM-designed cores. It's still 64-bit, luckily. The Tegra X1 features a big.LITTLE CPU configuration, with four high-performance Cortex-A57 cores and four Cortex-A53 cores designed for low-power operation; clock speeds have not been specified yet. It must be pointed out that, unlike Samsung's Exynos 5433, which also pairs quad-core Cortex-A57s and A53s in big.LITTLE, the Tegra X1's CPU cannot use both clusters at the same time, so it can't be considered a true octa-core CPU. That is a sensible choice by Nvidia, considering that the Cortex-A57 is already a very powerful core and that most applications don't scale well beyond four cores. Performance estimates can't be made until clock speeds are announced, but given that the Tegra X1 is meant for high-end tablets, I imagine the clocks will be pretty high, and it's safe to say the Tegra X1 will be no slouch when it comes to CPU performance.

Of course, another variant of the Tegra X1 with a Denver CPU is definitely possible, but as with the Tegra K1, such a variant would only be released later in the year. In truth, I'm surprised that Nvidia didn't make Denver the default CPU configuration for the Tegra X1. It showed very good performance in the Nexus 9, and I would have liked to see the Tegra X1 launch with it.


This is, of course, the spotlight of Nvidia's announcement. Nvidia's mobile-first development strategy has enabled the company to adapt its latest GPU architecture for mobile very quickly, and that represents a huge advantage over the competition. With a 2x performance-per-watt advantage over Kepler, the new Maxwell architecture is extremely power efficient, delivering much more performance than Kepler without using any more power.

In the Tegra K1, Nvidia had a single Kepler SMX (192 CUDA cores) running at up to 950MHz (although the devices that launched with it usually kept the clock at 850MHz), backed by 4 ROPs (render output units) and 8 TMUs (texture mapping units). With the Tegra X1, Nvidia moves to two of Maxwell's basic graphics units, called SMMs. Each contains 128 CUDA cores, so the Tegra X1 has a total of 256 CUDA cores, accompanied by 16 ROPs and 16 TMUs, all running, according to Nvidia, at a maximum clock speed of 1GHz. That clock speed sounds a bit optimistic, and it is quite possible that tablets running the Tegra X1 will keep the GPU clock somewhat lower for thermal and power budget reasons.

Wrapping up the technical stuff, here's a table comparing Nvidia's last few SoCs:

 Spec:  Tegra X1 | Tegra K1 | Tegra 4 | Tegra 3
 CPU:  64-bit quad-core Cortex-A57 + quad-core Cortex-A53 | 32-bit quad-core Cortex-A15 @ 2.3GHz + single "companion" Cortex-A15 core, or 64-bit dual-core Denver @ 2.5GHz | 32-bit quad-core Cortex-A15 @ 1.9GHz + single "companion" Cortex-A15 core @ ~800MHz | quad-core Cortex-A9 @ 1.6GHz + single "companion" core @ ~500MHz
 Lithography:  20nm | 28nm | 28nm | 40nm
 GPU core configuration:  256 CUDA cores, 16 ROPs, 16 TMUs | 192 CUDA cores, 4 ROPs, 8 TMUs | 48 pixel shaders, 24 vertex shaders | 8 pixel shaders, 4 vertex shaders
 GPU clock:  1,000MHz | 950MHz | 672MHz | 520MHz
 FP32 peak compute (GFLOPS):  512 | 365 | 97 | 12.5
 Pixel fill rate (MP/s):  16,000 | 3,800 | ? | ?
 Texture fill rate (MT/s):  16,000 | 7,600 | ? | ?
 Memory interface:  dual-channel 64-bit LPDDR4-1600 (25.6GB/s) | dual-channel 64-bit LPDDR3-1066 (17GB/s) | dual-channel 32-bit LPDDR3-1866 (15GB/s) | single-channel 32-bit LPDDR3-1600 (6.4GB/s)

The table clearly shows how far Nvidia has come since the Tegra 3. The Tegra X1 is a huge leap forward compared to the K1, in every aspect, especially in the graphics department. The Tegra X1's GPU is far beyond what previous-gen consoles like the Xbox 360 and PS3 could achieve, and even some current low-end dedicated laptop GPUs are less powerful than the Tegra X1's GPU. Kudos to Nvidia for this impressive achievement.
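The GPU rows of the table follow directly from the unit counts and clocks: CUDA cores x 2 FLOPs (an FMA) x clock for FP32 compute, ROPs x clock for pixel fill, and TMUs x clock for texture fill. A quick sketch checking the X1 and K1 entries against those formulas:

```python
def gpu_metrics(cuda_cores, rops, tmus, clock_mhz):
    """Derive peak FP32 GFLOPS and fill rates from GPU unit counts and clock speed."""
    return {
        "gflops": cuda_cores * 2 * clock_mhz / 1000,  # 2 FLOPs/core/clock via FMA
        "pixel_fill_mps": rops * clock_mhz,           # megapixels per second
        "texture_fill_mts": tmus * clock_mhz,         # megatexels per second
    }

x1 = gpu_metrics(256, 16, 16, 1000)  # Tegra X1 at its 1GHz max clock
k1 = gpu_metrics(192, 4, 8, 950)     # Tegra K1 at its 950MHz max clock
print(x1["gflops"], x1["pixel_fill_mps"], x1["texture_fill_mts"])
print(round(k1["gflops"]), k1["pixel_fill_mps"], k1["texture_fill_mts"])
```

These reproduce the 512 vs 365 GFLOPS and 16,000 vs 3,800/7,600 fill-rate figures in the table above.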


Nvidia's new focus on bringing their latest GPU architectures to mobile is doing them a lot of good. The new Tegra X1 has a very, very large graphics processor, but with the benefit of the Maxwell architecture's astounding power efficiency.

Nvidia's new processor is a great package overall, showing off excellent specs and top-notch future-proofing. Everything necessary for a new high-end SoC is there: 64-bit processing and a 20nm process, for instance. While in most aspects Nvidia is playing on equal ground with other flagship mobile processors, its GPU sets it apart from anything else on the market right now. The Tegra K1 was already ahead of pretty much every other SoC in graphics benchmarks, except for the Apple A8X; the Tegra X1 will help Nvidia extend that lead even further. I was only a bit disappointed that Nvidia, at least for now, is not using its Denver CPU cores in the Tegra X1 (and considering how well they performed in the Nexus 9, one might wonder why).

In terms of actual products that might eventually carry the Tegra X1, I believe that smartphones are still out of the picture. Despite the 20nm process and Maxwell's efficiency, a 256-core GPU might still be too much for a smartphone's battery size and thermal dissipation capacity. However, I can speculate that maybe a Tegra X1 with a much lower clocked GPU could make its way into a high-end phablet. 

Overall, a great package, with the added benefit of a GPU that rivals even some lower-end dedicated laptop GPUs. Nvidia did a great job with its new SoC, and while it may still not be fit for smartphones, the Tegra X1 is just about perfect for high-end tablets and compact gaming devices, and might just turn out to be this year's most powerful SoC.

Friday, January 16, 2015

Samsung Galaxy Note 4 & Note Edge Review

Samsung's Galaxy Note line went from a doubtful niche attempt to essentially the most popular line in Samsung's inventory in just four years. Naturally, Samsung's 2014 Note phablets were highly anticipated, and Samsung delivered accordingly. This time around, the direct successor to last year's Note 3, the aptly named Note 4, came accompanied by a very interesting attempt from Samsung at a mainstream phone using its curved display technology: the Galaxy Note Edge. It is very similar to the Note 4, except that a curved extension of the main display replaces the right side of the device.
Both phones excel in almost every aspect you can think of, and are definitely very worthy successors to last year's Galaxy Note 3. Those who are adventurous and looking for something new and different will look to the Note Edge, while those who just want a traditional phablet, albeit the best one on the market, will go for the regular Note 4.

To begin this review, let's look at how this year's Galaxy Notes fare in terms of pure specs, listed in the table below:

 Spec:  Galaxy Note 4 | Galaxy Note Edge
 Body:  153.5 x 78.6 x 8.5mm | 151.3 x 82.4 x 8.3mm
 Display:  5.7" Super AMOLED QHD (2560 x 1440, 515ppi) | 5.6" Super AMOLED WQXGA (2560 x 1600, 524ppi) with curved edge
 Storage & RAM:  32/64 GB, 3GB RAM | 32/64 GB, 3GB RAM
 Networks:  GSM, HSDPA, LTE Cat. 6 | GSM, HSDPA, LTE Cat. 6
 WiFi:  dual-band 802.11 a/b/g/n/ac | dual-band 802.11 a/b/g/n/ac
 Bluetooth:  Bluetooth 4.1 LE | Bluetooth 4.1 LE
 Camera (rear):  16MP with OIS, LED flash, face detection and HDR; 4K@30fps (2160p) video, 1080p@30fps or 720p@120fps with video stabilization (both models)
 Camera (front):  3.7MP with 2K@30fps (1440p) video (both models)
 OS:  Android 4.4 KitKat w/ TouchWiz UI (both models)
 Processor:  SM-N910S: Snapdragon 805; SM-N910C: Exynos 5433 | Snapdragon 805
 CPU:  SM-N910S: quad-core 32-bit Krait 450 @ 2.7GHz; SM-N910C: octa-core 64-bit big.LITTLE (quad-core Cortex-A57 @ 1.9GHz + quad-core Cortex-A53 @ 1.3GHz) | quad-core 32-bit Krait 450 @ 2.7GHz
 GPU:  SM-N910S: Adreno 420 @ 600MHz (337.5 GFLOPS); SM-N910C: Mali-T760MP6 @ 700MHz (204 GFLOPS) | Adreno 420 @ 600MHz (337.5 GFLOPS)
 Battery:  removable Li-Ion 3,220mAh; video playback: 14 hours | removable Li-Ion 3,000mAh; video playback: 12 hours
 Features:  heart rate and SpO2 sensor, fingerprint scanner, S Pen | Smart Edge screen, heart rate and SpO2 sensor, fingerprint scanner, S Pen


This is probably the only aspect where the Galaxy Note 4 and the Galaxy Note Edge actually differ significantly. While the Galaxy Note 4 has the regular shape we've come to expect phones to have, with a flat screen on the front, the Galaxy Note Edge makes an interesting use of Samsung's curved screen technology. Instead of a flat screen covering the front, the Note Edge's panel cascades down the right edge of the phone, replacing the entire right side of the device. While it makes for a very interesting phone aesthetically, what is even better is the added functionality the curved display offers. The side screen can be used for many things, like quick notifications, controls, and app shortcuts. While it won't exactly revolutionize the smartphone experience, it is a convenient extra to have.

Lefties beware! Since the Note Edge curves down the right side of the phone, it is less convenient for lefties to use. In the future, we might be seeing phones that curve down both sides, but until then, this device is more appropriate for right-handed users. 

In any case, aside from the curved screen, the Galaxy Note Edge and the Note 4 are very similar. Taking a leaf from the Galaxy Alpha's book, the two phablets' sides are made of aluminium (or, in the case of the Note Edge, three of the four sides), with the back panel made of plastic textured to resemble leather. The back covers of the new Notes are still removable, and so are the batteries. While I would've liked to see an all-aluminium design, the new Notes feel very good in hand thanks to the metal sides. The front of the devices resembles any recent Samsung device, with the usual physical home button sitting between the task-switcher and back capacitive buttons. Underneath the home button is Samsung's swipe-based fingerprint scanner.

Overall, a very good design for the new Notes. The aluminium sides in particular are a very welcome addition, especially coming from Samsung. And at long last, Samsung has found a non-disruptive and useful way to implement its curved AMOLED screens on a smartphone.


When it comes to displays, Samsung's flagships are always highly anticipated, as their AMOLED screens are always among the best in the mobile space. Both the Note Edge and the Note 4 feature Super AMOLED screens with a Diamond PenTile pixel matrix. Of course, the main difference here is that one has a flat screen and the other has a curved one.

The Note 4 features a 5.7" display, just like the Note 3, except that this time the resolution is bumped to a stunning 2560 x 1440, which results in a fine 515ppi. This pixel density is getting close to the limit beyond which the human eye cannot resolve further detail, but for now there are still advantages to be had from the extra resolution.

The Note Edge features a curved 5.6" display with a 2560 x 1600 resolution, which translates into a pixel density of 524ppi. Compared to the Note 4, the extra 160 pixels of width form the edge display portion. That aside, the Note Edge's display should be identical to the Note 4's, offering the same benefits of Samsung's AMOLED technology, like extremely saturated colors and stunning contrast.

Software & Features

Both the Galaxy Note 4 and the Note Edge run on an Android 4.4.4 KitKat build skinned with Samsung's TouchWiz UI. An update to Android 5.0 Lollipop should be coming out soon.

TouchWiz is known for its many software features, some useful, many useless. The same can obviously be said of the new Galaxy Notes' software. Features like Multi Window and Air Gestures continue to be implemented and refined in Samsung's new phablets. Despite how heavy the whole TouchWiz package is, the 3 GB of RAM and powerful processors should keep the new Notes running very smoothly, no matter what you throw at them.

Of course, these being devices from the Note range, a very important part of the package is the S Pen, Samsung's active stylus, which is better than ever this time around. More than ever, the S Pen lends itself to making the Note experience much more interactive and productive than on any other device.

Processor & Performance

Like every flagship device these days, the Galaxy Note 4 and Note Edge are powered by some of the most powerful processors currently available. There is a distinction to be made, however: while the Galaxy Note 4 ships with either a Snapdragon 805 or an Exynos 5433, depending on the region, the Galaxy Note Edge uses only the Snapdragon 805.

The Snapdragon 805, built on a 28nm HPm process (which is starting to get old), consists of a quad-core Krait 450 CPU clocked at up to 2.7GHz. The CPU is 32-bit, so when Lollipop comes to the Snapdragon 805-powered Notes, the new OS's 64-bit support will be of no benefit. Alongside the CPU, there is an Adreno 420 GPU clocked at 600MHz and a dual-channel 64-bit LPDDR3-1600 memory interface, offering ample bandwidth at a peak 25.6GB/s.

Samsung's Exynos 5433 processor, used in some variants of the Note 4 but not in the Note Edge, is a totally different beast. Built on Samsung's cutting-edge 20nm HKMG process node, it has a big.LITTLE CPU configuration, with four high-performance Cortex-A57 cores clocked at 1.9GHz and four low-power Cortex-A53 cores clocked at 1.3GHz. When necessary, both CPU clusters can work together, making it a true octa-core CPU. Backing up this beastly CPU is ARM's Mali-T760 GPU clocked at 700MHz, and, less impressively, the system is fed by a dual-channel 32-bit LPDDR3-1650 memory interface capable of delivering up to 13.2GB/s of bandwidth. This is much less than what the Snapdragon 805's memory interface can deliver, and I'm not sure 13.2GB/s can cut it for such a high-resolution display. It could pose a bottleneck when running bandwidth-heavy games at the Note 4's native resolution.
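Both bandwidth figures come from the same formula: channels x bus width x transfer rate, divided by 8 bits per byte. A sketch with the interface parameters as listed (the LPDDR3 "speed grades" here are transfer rates in MT/s):

```python
def bandwidth_gbs(channels, bus_width_bits, transfer_rate_mts):
    """Peak memory bandwidth in GB/s: channels x bus width x transfer rate / 8 bits per byte."""
    return channels * bus_width_bits * transfer_rate_mts / 8 / 1000

print(bandwidth_gbs(2, 64, 1600))  # Snapdragon 805: 25.6 GB/s
print(bandwidth_gbs(2, 32, 1650))  # Exynos 5433: 13.2 GB/s
```

The halved bus width on the Exynos side is exactly why its bandwidth lands at roughly half of the Snapdragon 805's, despite the similar transfer rate.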

All variants of the Note 4 and Note Edge carry 3 GB of RAM, which should be more than enough, even with Samsung's heavy TouchWiz features eating into memory.


Considering the large, high-resolution displays and powerful processors, the Galaxy Note 4 and Note Edge need large batteries to last long enough. The Galaxy Note 4 is fed by a large 3,220 mAh battery, which should be enough despite the power-hungry internals. However, probably because of the curved display portion, Samsung had to reduce the Galaxy Note Edge's battery to 3,000 mAh. While that is still a large battery, it means the Note Edge will hardly win any battery life tests.
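Samsung's quoted video playback times imply an average power draw you can back out from capacity: mAh times the nominal cell voltage gives watt-hours, divided by runtime gives watts. A rough sketch (the 3.85V nominal voltage is my assumption for a modern Li-Ion cell, not a Samsung figure):

```python
def avg_draw_watts(capacity_mah, nominal_v, runtime_h):
    """Average power draw (W) implied by battery capacity and quoted runtime."""
    return capacity_mah / 1000 * nominal_v / runtime_h

print(round(avg_draw_watts(3220, 3.85, 14), 2))  # Note 4, 14h video: ~0.89 W
print(round(avg_draw_watts(3000, 3.85, 12), 2))  # Note Edge, 12h video: ~0.96 W
```

The Note Edge's slightly higher implied draw lines up with its smaller battery and shorter quoted runtime.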


The new Galaxy Note 4 and Note Edge are excellent devices, delivering the goods in pretty much every aspect you can possibly think of. Beautiful screen, attractive design, powerful hardware and, of course, the venerable S Pen all make the new Notes very worthy successors of the flagship Note line.

The Note 4 is not exactly revolutionary compared to last year's Galaxy Note 3; however, every aspect of it has been improved, securing Samsung's advantage in the phablet market. The Galaxy Note Edge, meanwhile, shows Samsung still experimenting with ways to implement its curved displays, and it is Samsung's best attempt so far. Without impairing the device's usability, Samsung managed to implement its curved screen technology in a way that not only makes for an aesthetically pleasing phone, but also adds functionality people might actually use, unlike previous attempts (Galaxy Round, I'm talking to you).

Overall, Samsung's 2014 Note devices are their best phablets ever, and probably the best in the entire phablet market, scoring high marks in every aspect you can think of.