Wednesday 2 April 2014

Nvidia GTC: The GPU has come of age for general-purpose computing


The Titan supercomputer, which pairs GPUs with CPUs

Traditionally, powerful graphics processors have mattered mostly to gamers chasing realistic visuals and to engineers and creatives who need 3D modeling. After spending a few days at this year’s Nvidia GPU Technology Conference (GTC), it is very clear that the uses for GPUs have exploded — they have become an essential element in dozens of computing domains. As one attendee suggested to me, GPUs could now be better described as application co-processors.

GPUs are a natural fit for compute-intensive applications because they can process hundreds or even thousands of pieces of data at the same time. Modern GPUs can have several thousand small, simple cores that work in groups on large amounts of data in parallel. Nvidia’s release of its CUDA (Compute Unified Device Architecture) SDK in 2007 helped usher in an era of explosive growth for general-purpose programming on GPUs (often referred to as GPGPU).
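
To give a rough sense of what that programming model looks like, here is a minimal CUDA sketch (not any particular vendor demo; the array size, the saxpy kernel name and the launch configuration are arbitrary choices for illustration). Thousands of threads are launched at once, each applying the same operation to a different element of an array:

#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales and accumulates one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                       // guard threads that fall past the end of the array
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;           // one million elements, processed in parallel
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);     // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The <<<blocks, threads>>> launch syntax is what spreads the work across the GPU’s cores; the same pattern scales from a handful of elements to many millions.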

Imaging and vision have a lot in common with graphics

GPU-accelerated applications leverage both the GPU and the CPU

Two of the most active markets for GPGPU computing are image processing and computer vision. Much like computer graphics, both require running algorithms over potentially millions of elements in realtime — exactly what a GPU is designed to do well. One of the most striking demonstrations of the power even a mobile GPU can bring to bear on computer vision is Google’s Project Tango. With only a tricked-out smartphone, Tango records over 250,000 3D measurements each second, using them to build a highly accurate map of the surrounding building — including rooms, furniture and stairwells — as the user walks around. To do that, it uses not just a state-of-the-art Nvidia mobile GPU — which project lead Johnny Lee points out has more computing horsepower than the autonomous vehicle that won the 2005 DARPA Grand Challenge — but also two custom vision processors.
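
To see why image work maps so naturally onto a GPU, consider a hypothetical kernel along these lines (a toy brightness adjustment, not code from Tango or any product mentioned here; the brighten and brighten_frame names and the 1080p frame size are assumptions for illustration). The frame is treated as a 2D grid and one thread handles one pixel, so a multi-megapixel image is processed in a single parallel pass:

#include <cuda_runtime.h>

// Brighten an 8-bit greyscale image in place, one thread per pixel.
__global__ void brighten(unsigned char *img, int width, int height, int delta)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;   // pixel column
    int y = blockIdx.y * blockDim.y + threadIdx.y;   // pixel row
    if (x < width && y < height) {
        int v = img[y * width + x] + delta;
        img[y * width + x] = v > 255 ? 255 : (v < 0 ? 0 : v);   // clamp to 0..255
    }
}

// Host-side launch: tile the image into 16x16 blocks of threads, so a
// 1920x1080 frame is covered by roughly 8,000 blocks running concurrently.
void brighten_frame(unsigned char *d_img, int width, int height, int delta)
{
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    brighten<<<grid, block>>>(d_img, width, height, delta);
    cudaDeviceSynchronize();
}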

To give you a sense of how quickly a GPU can get through a task by working on it in parallel, the MythBusters put together this amusing and instructive demonstration for Nvidia:


Big data is a lot like big graphics

Delphi showed off a prototype user-customizable car dashboard, with the GPU letting drivers theme their car's display

It didn’t take long for the big data craze to tap into GPU horsepower either. For example, startup Map-D has found a way to use GPUs and their memory to implement an ultra-high-speed, SQL-compatible database, allowing it to analyze enormous amounts of data in near-realtime. One of its eye-opening demos is an interactive data browser of tweets worldwide: the system performs realtime analysis of one billion tweets using eight Nvidia Tesla boards in a single server. Map-D won the Emerging Company Showcase at GTC with its cool demos, but it wasn’t the only startup applying GPUs to big data. Brytlyt is using Nvidia GPU cards to run, in just six minutes, queries that it says would take Google’s BigQuery 30 years. Brytlyt’s software aims to let large retailers run better interactive promotions and targeted marketing by reacting quickly to customer location and actual purchases.
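
The common trick behind these systems is keeping the working set resident in GPU memory and scanning it with thousands of threads at once. As a loose, hypothetical sketch of that pattern (not Map-D’s or Brytlyt’s actual code; the column layout, row count and country codes are invented), Nvidia’s Thrust library can express a SQL-style filtered count as a single parallel scan:

#include <cstdio>
#include <thrust/device_vector.h>
#include <thrust/tabulate.h>
#include <thrust/count.h>

// Fill the column with dummy data: row i gets one of 200 fake country codes.
struct fake_country
{
    __host__ __device__ int operator()(int i) const { return i % 200; }
};

// Predicate evaluated on the GPU: does this row match the country we want?
struct is_country
{
    int code;
    __host__ __device__ bool operator()(int c) const { return c == code; }
};

int main()
{
    const int n = 10000000;                      // 10 million rows, for illustration
    // Hypothetical columnar store: one country code per tweet, resident in GPU memory.
    thrust::device_vector<int> country(n);
    thrust::tabulate(country.begin(), country.end(), fake_country());

    is_country pred = { 42 };
    // The equivalent of SELECT COUNT(*) ... WHERE country = 42, done in one parallel scan.
    int matches = thrust::count_if(country.begin(), country.end(), pred);
    printf("%d matching rows\n", matches);
    return 0;
}

Because the column never leaves the GPU, the only data crossing the bus is the final count, which is what makes interactive response times possible at this scale.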

Global Valuation uses GPUs to tame big data in the back office of financial firms. It is tackling the esoteric job of risk management for firms holding massive, interconnected portfolios of derivative securities. Apparently — despite the lessons learned in the financial markets meltdown — current risk management tools (even running on 30,000 CPU cores) can evaluate only a fraction of the scenarios needed to accurately gauge risk. They are also much too slow to run in realtime before trades are made, leaving companies exposed during the trading day. Running entirely in GPU memory, Global Valuation says it can process 100,000 interconnected scenarios in under a second — fast enough to double-check a company’s portfolio risk before each new trade is made.
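
Global Valuation’s engine is proprietary, but the general pattern it describes, holding every scenario in GPU memory and revaluing the portfolio against all of them in parallel, can be sketched roughly as below. The scenario count matches the figure quoted above; the revalue kernel name, the portfolio size and the toy linear revaluation are purely illustrative assumptions:

#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical sketch: one thread revalues the whole portfolio under one scenario.
// positions[j] is the size of position j; shocks is a scenarios x positions matrix
// of per-position price shocks. Real revaluation models are far richer; a simple
// dot product stands in for them here.
__global__ void revalue(const float *positions, const float *shocks,
                        float *scenario_pnl, int numPositions, int numScenarios)
{
    int s = blockIdx.x * blockDim.x + threadIdx.x;   // scenario index
    if (s < numScenarios) {
        float pnl = 0.0f;
        for (int j = 0; j < numPositions; ++j)
            pnl += positions[j] * shocks[s * numPositions + j];
        scenario_pnl[s] = pnl;
    }
}

int main()
{
    const int numScenarios = 100000;   // the scale quoted above
    const int numPositions = 1000;     // illustrative portfolio size

    float *positions, *shocks, *pnl;
    cudaMallocManaged(&positions, numPositions * sizeof(float));
    cudaMallocManaged(&shocks, (size_t)numScenarios * numPositions * sizeof(float));
    cudaMallocManaged(&pnl, numScenarios * sizeof(float));
    for (int j = 0; j < numPositions; ++j) positions[j] = 1.0f;
    for (size_t k = 0; k < (size_t)numScenarios * numPositions; ++k)
        shocks[k] = (k % 7) * 0.01f - 0.03f;         // dummy shock data

    // One thread per scenario: all 100,000 scenarios are evaluated concurrently.
    revalue<<<(numScenarios + 255) / 256, 256>>>(positions, shocks, pnl,
                                                 numPositions, numScenarios);
    cudaDeviceSynchronize();
    printf("scenario 0 P&L: %f\n", pnl[0]);

    cudaFree(positions); cudaFree(shocks); cudaFree(pnl);
    return 0;
}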

