Perhaps because it was the first quad-core SoC to really make an impact in mobile, or because of the recent tiff with Apple over who really has the better graphics engine, or because of the contrast with Qualcomm’s Snapdragon S4, which integrates LTE connectivity and is poaching design wins, a lot has been written about the Tegra 3. NVIDIA has built an extremely competitive solution in a relatively short time, but its full impact may not be seen until gaming tablets and optimized games using advanced effects take shape.
The Tegra 3 is best described as burst performance when needed. Several features help the device save power, but when the workload calls for it, the Tegra 3 delivers more processing and graphics performance. A coarse block diagram shows the highlights:
CPU: Rather than optimize all cores for full dynamic capability, NVIDIA implemented a precursor to what ARM now calls the “big.LITTLE” approach. There are actually five ARM Cortex-A9 cores on the chip: four that can run at 1.4 GHz (or 1.5 GHz in a special single-core turbo mode), plus a fifth core – the “battery saver” – clocked at 500 MHz. The battery-saver core runs with the four main cores powered off until the workload demands more performance; it then switches off, and one or more of the four main cores comes online. Memory is a single-channel LPDDR2 interface at 1066 MHz, which probably limits performance.
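The hand-off between the battery-saver core and the main cores can be sketched as a simple policy. This is an illustrative model only – the thresholds, names, and function are made up for this sketch; the real switchover is handled in hardware and firmware and is transparent to the OS:

```python
# Illustrative sketch of a 4+1 switching policy. Clocks come from the
# figures above; the decision logic itself is hypothetical.

COMPANION_MAX_HZ = 500_000_000       # "battery saver" core ceiling
MAIN_MAX_HZ = 1_400_000_000          # each of the four main cores
SINGLE_CORE_TURBO_HZ = 1_500_000_000  # one main core in turbo mode

def select_cores(demand_hz: int) -> dict:
    """Pick an active-core configuration for a given performance demand.

    Only one cluster is active at a time: either the companion core
    alone, or one to four main cores.
    """
    if demand_hz <= COMPANION_MAX_HZ:
        return {"cluster": "companion", "cores": 1, "clock_hz": demand_hz}
    if demand_hz <= SINGLE_CORE_TURBO_HZ:
        # Hand off to the main cluster; a single core may turbo to 1.5 GHz.
        return {"cluster": "main", "cores": 1, "clock_hz": demand_hz}
    # Spread heavier demand across up to four main cores (ceiling division).
    cores = min(4, -(-demand_hz // MAIN_MAX_HZ))
    return {"cluster": "main", "cores": cores, "clock_hz": MAIN_MAX_HZ}
```

The key property the sketch captures is that light workloads never wake the main cluster, which is where the power savings come from.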
Graphics: The ULP GeForce engine inside the Tegra 3 features 12 “cores”, which are really SIMD thread execution units. Reports put the GPU clock at 500 MHz, which works out to a peak of 12 GFLOPS. On the surface, this falls short of the Apple A5X, whose graphics engine features 16 SIMDs with 4 multiply-adds each, translating to a peak of 32 GFLOPS. The controversy lies in how code is optimized for shading and vertex processing, and we’re still waiting for NVIDIA to weigh in on Apple’s claims. One fact: measured on the GLBenchmark 2.1.2 Egypt offscreen 720p test, the Tegra 3 delivers 58 fps, while the new iPad delivers 139 fps. But keep reading that link: when more effects are added, as is typical of gaming rather than just video playback, the Tegra 3’s strengths start to become more evident. (Of course, the same argument could be applied to optimizing for PowerVR SGX543MP4 effects.)
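The peak-GFLOPS figures are straightforward multiplication, counting a multiply-add as two floating-point operations. A quick sketch of the arithmetic – note the A5X clock below is inferred from the quoted 32 GFLOPS figure, not a published spec:

```python
def peak_gflops(units: int, madds_per_unit: int, clock_ghz: float) -> float:
    """Peak GFLOPS, counting each multiply-add as 2 FLOPs."""
    return units * madds_per_unit * 2 * clock_ghz

# Tegra 3 ULP GeForce: 12 execution units, reportedly at 500 MHz.
tegra3 = peak_gflops(units=12, madds_per_unit=1, clock_ghz=0.5)   # 12.0

# Apple A5X (SGX543MP4): 16 SIMDs x 4 multiply-adds each. The quoted
# 32 GFLOPS implies a ~250 MHz clock -- an inference, not a spec sheet.
a5x = peak_gflops(units=16, madds_per_unit=4, clock_ghz=0.25)     # 32.0
```

Peak numbers like these say nothing about how well shader and vertex workloads actually map onto each architecture, which is exactly the point of contention.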
Connectivity: Tegra 3, at least in its current versions, doesn’t integrate a baseband. This appears to have cost NVIDIA some business where carriers have forced manufacturers, notably HTC, to swap out the Tegra 3 in some markets. Stay tuned for “Grey” on the roadmap, where the Icera modem begins to show up on-chip.
The issue Tegra 3 has exposed is a complicated one. What’s a typical workload for a tablet? Are there scenarios that would routinely engage four processing cores, or games that take full advantage of advanced effects? If the answer is yes, the performance the Tegra 3 offers is more competitive. If the answer is no, other dual-core hardware can keep up. Could a Doom-like benchmark for tablets be coming? This is further complicated because directly comparing Android and iOS (and wait until we have to look at Windows Phone 8 and maybe BlackBerry 10) is subjective, which is why GLBenchmark frame rates are most often used.
With design wins in the ASUS Transformer Prime, Acer Iconia Tab A510, LG Optimus 4X HD, HTC One X (non-LTE), and ZTE Era, the Tegra 3 has made some progress. While there was some speculation Amazon was changing horses for the next Amazon Kindle Fire, it appears that TI will continue to have that business. Design wins are vital to NVIDIA, perhaps more than any other mobile SoC architecture since they are a newcomer, and it’s been uphill against well-entrenched competition and limited opportunity space.
NVIDIA is going all-in on the 4+1 approach. On the roadmap for 2013 is “Wayne”, a quad-core ARM Cortex-A15 clocked possibly at 2 GHz and incorporating technology from the new Kepler GPU line, and also with a battery-saver core. This will stand in contrast to dual-core A15 implementations such as the TI OMAP 5 and Samsung Exynos 5.
This two-part debate – whether tablets and superphones see prolonged quad-core workloads, and whether games adopt optimized advanced effects – will shape both NVIDIA’s strategy and its success in mobile over the long term. What are your thoughts?
Next up in the Smart Mobile SoC series: Intel.
(Disclaimer: Not sponsored by, or invested in, companies mentioned. Never free, never me, but I am inexpensive when you need help with content on topics in social computing like this.)
Chief Story Officer, Left2MyOwnDevices