GPU-accelerated data science & AI

Today, Artificial Intelligence is synonymous with neural networks. For decades, this was not the case. The widespread development and adoption of neural network techniques has led to a boom in AI. But why? And why now?

The artificial intelligence market has exploded in less than 10 years, going from the obscurity of research labs to impacting every industry in the world. You could be forgiven for thinking that researchers made some new breakthrough in neural networks 10 years ago that led to the boom we are witnessing now.

In fact, computer-based neural networks have been part of AI research for well over 70 years. The first major paper on neural networks was published by McCulloch and Pitts in 1943, and their proposed model was applied shortly thereafter. Later, in the mid-1980s, researchers developed backpropagation, a method to train neural networks containing more than one layer.

[Image: Dual Quadro RTX 8000 graphics. GPU parallel computing provides the performance needed for deep learning.]

The multi-layer breakthrough led to an increase in neural network R&D. Still, by the end of the 1990s, several issues hampered progress. These included a lack of readily available data for training. More important, perhaps, was the lack of computing power needed for training.

One of the most important advances in AI has been the application of GPU acceleration to deep learning. This has enabled an explosion of R&D in artificial intelligence.

Enter the GPU

The GPU is a powerful, inherently parallel processor. It is tasked with performing the complex mathematical operations behind real-time 3D graphics, generating a continuous flow of millions of pixels. For real-time rendering on a typical 4K display, a GPU produces between half a billion and one billion pixels per second.
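That pixel-rate figure follows from simple arithmetic. Here is a minimal sketch, assuming a 3840 x 2160 panel refreshed at 60 to 120 frames per second (illustrative values, not a statement about any particular GPU):

    # Pixel throughput for real-time rendering on an assumed 4K display (3840 x 2160).
    WIDTH, HEIGHT = 3840, 2160
    pixels_per_frame = WIDTH * HEIGHT  # about 8.3 million pixels

    for fps in (60, 120):  # typical refresh rates, assumed for illustration
        pixels_per_second = pixels_per_frame * fps
        print(f"{fps} fps: {pixels_per_second / 1e9:.2f} billion pixels per second")
    # Roughly 0.50 billion pixels per second at 60 fps and 1.00 billion at 120 fps.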

It makes sense that a powerful, highly parallel computing engine like the GPU would be a perfect match for the parallel processing demands of neural network training. Two advantages are particularly interesting for AI: the GPU is a high-performance processor well adapted to AI calculations, and GPUs are mass produced and therefore widely available and affordable.

This combination allows GPUs to train the larger neural networks needed for deep learning on massive data sets. The increase in raw parallel processing performance reduced training times by 50%, 70%, and more. Training that would have required months or even a year was reduced to a week or even days. In practical terms, this is a binary result: training that takes a year is a "0", unworkable, while training that finishes in less than a week is a "1", something a team can actually iterate on.
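The speedup comes from the dense matrix arithmetic that dominates neural network training, which maps directly onto the GPU's parallel cores. A minimal sketch of the effect, assuming PyTorch is installed on a CUDA-capable machine (the matrix size and repetition count are arbitrary illustration values):

    import time
    import torch

    def average_matmul_time(device: str, n: int = 4096, reps: int = 10) -> float:
        # Time repeated n x n matrix multiplications, the core operation
        # behind fully connected and convolutional layers during training.
        a = torch.randn(n, n, device=device)
        b = torch.randn(n, n, device=device)
        if device == "cuda":
            torch.cuda.synchronize()  # ensure timing covers the GPU work
        start = time.perf_counter()
        for _ in range(reps):
            a @ b
        if device == "cuda":
            torch.cuda.synchronize()
        return (time.perf_counter() - start) / reps

    cpu_time = average_matmul_time("cpu")
    if torch.cuda.is_available():
        gpu_time = average_matmul_time("cuda")
        print(f"CPU {cpu_time:.3f} s, GPU {gpu_time:.3f} s, speedup {cpu_time / gpu_time:.0f}x")
    else:
        print(f"CPU only: {cpu_time:.3f} s per multiplication")

The exact speedup depends on the hardware, but the same principle is what turns months of training into days.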

A milestone in this development occurred at the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) in 2012. Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton created a Convolutional Neural Network (CNN), known as AlexNet, trained using GPU computing. Their system won the challenge by an impressive margin. While this was not the first time GPU computing had been applied to training, it became a tipping point for the industry.

The tools available to developers have continued to mature. NVIDIA provides a complete, fully GPU-accelerated software stack for AI development. It runs on workstations, servers, and in the cloud, and scales from notebook computers to data center clusters. Workstation vendors provide this software pre-installed and tested on their Data Science Workstation products.
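As a flavor of what that stack looks like in practice, here is a minimal sketch using RAPIDS cuDF, one of the GPU-accelerated data science libraries NVIDIA ships. It assumes cuDF is installed and a CUDA-capable GPU is present; the column names and values are invented for illustration:

    import cudf  # GPU DataFrame library from NVIDIA's RAPIDS suite

    # Illustrative data; a real workload would load millions of rows,
    # for example with cudf.read_csv.
    df = cudf.DataFrame({
        "segment": ["retail", "retail", "industrial", "industrial"],
        "revenue": [120.0, 95.5, 310.2, 287.9],
    })

    # The pandas-like API executes the aggregation on the GPU.
    print(df.groupby("segment").mean())

Because the API mirrors pandas, existing data science code can often be moved to the GPU with little change.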

But Wait. There's More

GPU acceleration is not the only factor contributing to the AI explosion, but we can argue that it has been essential. Other key factors include massive data sets for training, improved neural network architectures, and better techniques for network initialization.

Researchers found that the performance of their systems improved with very deep neural networks. They also found that the data sets needed to be much larger. And by investigating why networks underperformed, they discovered that the way a network is initialized for training could be improved. The first two factors increase training times, which makes the use of GPU computing for training even more important.

And where is the market going? In terms of technologies, AI is being applied significantly to computer vision and natural language processing. Both are broad technologies used in many industry segments. But AI is touching every industry, and touching it in unexpected ways. Which industries will benefit from AI-generated musical compositions? Which will benefit from AI-based video resolution scaling?

These examples are specific, but their application to industry is broad. In fact, the broad use of AI technology across many different industries is one reason to take market growth projections seriously. Market estimates project roughly 10-fold growth over five to seven years, with CAGR in the 30% to 45% range.
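Those figures hang together at the longer end of the horizon; a minimal sketch of the compound-growth arithmetic, using only the rates and horizons quoted above:

    # Compound annual growth: total growth = (1 + cagr) ** years.
    for cagr in (0.30, 0.45):
        for years in (5, 7):
            growth = (1 + cagr) ** years
            print(f"CAGR {cagr:.0%} over {years} years -> {growth:.1f}x growth")
    # 30% over 7 years gives about 6.3x; 45% over 7 years gives about 13.5x,
    # bracketing the roughly 10-fold growth cited above.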

There is reason to be optimistic. The volume of data generated every day is growing, and much of it will be training fodder for new applications of AI technology. GPU computing power continues to grow: I recently looked at a Data Science Workstation with dual Quadro RTX 8000 graphics, a configuration with over 37 billion transistors in the GPUs alone. NVIDIA's new Ampere architecture is on the market now, and it raises performance significantly. Finally, companies and governments are investing in AI, and this investment is driving innovation and improvements in deep learning technology.

A Final Perspective

AI technology based on neural networks seemed to explode out of nowhere when, in fact, the idea is essentially as old as the computer itself. The ability to apply massively parallel GPU computing, combined with huge amounts of training data and new developments in neural network techniques, created interest and investment in developing the technology. AI itself can be applied across all of the largest industries, which in turn is generating tremendous investment in artificial intelligence. This positive feedback loop has produced the explosion we are witnessing today, and it is a boom that will continue for many years to come.
