GPUs Are Taking Over


Recently, I’ve been watching tech conference keynote speeches by nVidia’s founder & CEO, Jen-Hsun Huang. What I’m realizing from these talks is just how important graphics processing units (GPUs) are for enabling the next era of computing – an era that many are calling the “cognitive computing” era.

GPUs – Not Just for Graphics

Like many people, I simply [and wrongly] assumed that GPUs were only used for processing graphics in industries such as gaming, digital animation, 3D engineering, and architecture. Even knowing that some bitcoin miners used GPUs to mine cryptocurrencies, it didn’t really hit me until two days ago that GPUs are the real workhorses of the modern AI applications being developed by companies like Google, Facebook, Microsoft, Baidu, and beyond. This breakthrough – using GPUs to train machine learning neural networks – happened around 2012, as explained below.

A number of years back, Mythbusters did a great demo in partnership with nVidia to illustrate the difference between sequential CPU processing and massively parallel GPU processing, which I’ve included below.
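To make the contrast concrete, here’s a minimal sketch in Python using Numba’s CUDA bindings (my own illustration, not part of the demo; it assumes the numba package and a CUDA-capable GPU). The same vector addition is written once as a sequential CPU loop and once as a GPU kernel in which every element gets its own thread.

```python
import numpy as np
from numba import cuda

def add_cpu(a, b, out):
    # Sequential: the CPU walks the array one element at a time.
    for i in range(a.shape[0]):
        out[i] = a[i] + b[i]

@cuda.jit
def add_gpu(a, b, out):
    # Parallel: each GPU thread computes exactly one element.
    i = cuda.grid(1)
    if i < a.shape[0]:
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

add_cpu(a, b, out)  # one core, n sequential iterations

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_gpu[blocks, threads_per_block](a, b, out)  # ~n threads at once
```

The kernel body is nearly identical to the loop body; the difference is that the GPU launches thousands of threads that all run it at the same time – exactly the Mythbusters paintball-robot picture.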

AI’s ‘Big Bang’ Moment 

The so-called Big Bang in deep learning (a field of AI research) happened in 2012 at the University of Toronto. As described by nVidia’s CEO, something truly remarkable happened when U of T computer science professor Geoffrey Hinton and his students Alex Krizhevsky & Ilya Sutskever used nVidia graphics cards to train a deep neural network, achieving a fundamental breakthrough in AI performance. The network was called AlexNet, and it radically improved the accuracy of image recognition over previous methods.
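To see why GPUs and neural networks fit together so well, here’s a toy Python sketch (my own illustration, not AlexNet’s actual code). A convolutional layer boils down to an enormous number of independent multiply-accumulates, and because every output pixel can be computed without waiting on any other, the whole layer maps naturally onto a GPU’s parallel hardware.

```python
import numpy as np

def conv2d_relu(image, kernel):
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Every output pixel is independent of every other one,
            # so a GPU can compute them all in parallel.
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return np.maximum(out, 0)  # ReLU, the nonlinearity AlexNet popularized

edge_filter = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]], dtype=float)
features = conv2d_relu(np.random.rand(224, 224), edge_filter)
```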


Since then, the artificial intelligence & GPU developer community has exploded, with an estimated 25x increase in the number of developers in only two years (2014-2016). More importantly, researchers have shown that gains in GPU computing power have significantly outpaced gains in CPU computing power over the last five years. Moore’s Law, which applies to CPUs, seems to be plateauing, and it appears it will be transcended by 3D chip architectures that allow for the massively parallel processing at the core of cognitive applications such as natural language processing, autonomous vehicles, and cloud-based artificial intelligence.
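As a back-of-the-envelope illustration of why that divergence matters, consider how small differences in yearly improvement compound. The growth rates below are assumptions chosen for the example, not measured figures.

```python
# Illustrative compound growth: hypothetical yearly performance
# multipliers, not measured CPU/GPU benchmark data.
cpu_rate, gpu_rate, years = 1.1, 1.5, 5

cpu_gain = cpu_rate ** years   # ~1.6x
gpu_gain = gpu_rate ** years   # ~7.6x

print(f"Over {years} years: CPU ~{cpu_gain:.1f}x, GPU ~{gpu_gain:.1f}x")
```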

This explosion in the use of GPUs for AI applications can be seen quite clearly in nVidia’s stock price compared to Intel’s over the last 10 years. If we can say that Intel is fundamentally a CPU company and nVidia is fundamentally a GPU company, then what does that comparison say about the future of computing?


Why nVidia?

Digging into nVidia’s technology platforms a bit, we see waves of new product lines, partnerships, and technologies being released every year; CES 2017 alone brought a flood of announcements.

When it comes to gaming, although popular consoles such as the Microsoft Xbox One and Sony PlayStation 4 use AMD chips, a huge population of developers and PC gamers use nVidia hardware, frameworks, & libraries.

Gaming has been nVidia’s bread & butter for a long time, but they’re increasingly making strides in other markets such as cloud computing, data centres, IoT, autonomous vehicles, and beyond. They’re providing the hardware and enabling technologies upon which the future of AI is being built.

Notable nVidia Products & Services:

  • GeForce & Titan – nVidia’s Latest GPUs for 4K, 3D, VR/AR/MR & AI Applications
  • GeForce Now – nVidia’s Cloud-Based Gaming Service 
  • Shield & Spot – nVidia’s Connected Home Platform 
  • Xavier & AI Co-Pilot – nVidia’s AI Car Platform

Notable Partnerships & Initiatives 

  • Car Companies & Auto Supplier Companies (Audi, Bosch, ZF)
  • Mapping Companies (Baidu, HERE, TomTom, Zenrin)
  • IBM/Watson
  • Facebook/Oculus
  • Tesla
  • Uber
  • Amazon

Emerging Trends & Applications of nVidia’s Technology 

Cognitive Computing in the Cloud

  • In the same way that Salesforce created the Software-as-a-Service market, other companies will offer “cognition as a service,” likely using chips from companies such as nVidia. IBM has a Watson service in the cloud, Google will be offering machine learning services through its ML Engine APIs, and so on (a rough sketch of such a call follows below).
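To make “cognition as a service” concrete, here’s a minimal sketch of calling a hosted model through Google’s Cloud ML Engine predict API in Python. The project and model names are hypothetical, and this assumes the google-api-python-client package, valid Google Cloud credentials, and an already-deployed model; treat it as an illustration rather than production code.

```python
# Hedged sketch: online prediction against a hosted model on
# Google Cloud ML Engine. "my-project" and "image_classifier"
# are hypothetical placeholders.
from googleapiclient import discovery

service = discovery.build('ml', 'v1')
name = 'projects/my-project/models/image_classifier'

response = service.projects().predict(
    name=name,
    body={'instances': [{'pixels': [0.1, 0.7, 0.3]}]}  # toy input
).execute()

print(response['predictions'])
```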


Cloud Gaming

  • Services like GeForce Now (mentioned above) move the GPU itself into the data centre, streaming fully rendered games to lightweight client devices.

The List Goes On…

Further Questions: 

  • What about Google’s TPUs? What about nVidia’s Volta architecture?
  • What have people like Elon Musk and other tech/social/economic luminaries said about the future of cognitive computing? What is the Davos position?
  • What are the social, economic, and regulatory implications of this rapidly advancing technology? What’s Trump’s policy on automation, AI research, etc.?
  • Take another look at Marshall Brain’s comments/predictions about computer vision tech and emergent AI.
  • If the level of complexity of DNNs (artificial brains) in 2017 rivals that of a bee or bird brain, how long before that complexity begins to approach the level of a human brain?
  • What does this say about the SMIILE paradigm? Particularly the aspects of Intelligence Increase & Space Exploration?

Some Final Thoughts

If tech luminaries like Jen-Hsun Huang and Kevin Kelly are correct, the future of computing will be cognitive. Adding cognitive capabilities to tasks that once demanded human nuance will become increasingly possible.
