Nvidia is breaking records on Wall Street, having become a tech superpower thanks to the artificial intelligence revolution. The company originally made its name manufacturing the computer chips that process graphics, particularly for gaming.
When ChatGPT was released to the public last November, it made an impact far beyond the technology industry.
From helping with speech writing to computer coding, suddenly artificial intelligence (AI) was emerging as a real and useful tool.
However, none of this would be possible without very powerful computer hardware.
And one company in particular became the center of the AI bonanza: California-based Nvidia.
Nvidia’s hardware, originally designed to process graphics, particularly for computer games, is the foundation of most AI applications today.
“It’s the leading player in technology that enables this new thing called artificial intelligence,” says Alan Priestley, a semiconductor industry analyst at Gartner.
“What Nvidia is to AI is almost like what Intel was to PCs,” explains Dan Hutcheson, an analyst at TechInsights.
ChatGPT was trained using 10,000 of Nvidia’s graphics processing units (GPUs), clustered on a supercomputer owned by Microsoft.
“It’s one of many supercomputers that have been built with Nvidia GPUs, for a wide variety of scientific and artificial intelligence uses,” says Ian Buck, general manager and vice president of accelerated computing at Nvidia.
Nvidia has about 95% of the machine learning GPU market, noted a recent report from CB Insights.
Its artificial intelligence business generated about $15 billion in revenue last year, up 40% from the previous year and surpassing gaming as its largest source of revenue.
Nvidia shares soared nearly 30% after it released its first-quarter results on Wednesday. The company said it is ramping up production of its chips to meet “growing demand.”
The AI chips cost about $10,000 each, although the latest and most powerful versions sell for far more.
How did Nvidia become a central player in the AI revolution?
Ultimately, it comes down to the marriage of a bold bet and good timing.
A crucial decision
Jensen Huang, now Nvidia’s CEO, co-founded the company in 1993. At the time, Nvidia focused on improving graphics for games and other applications.
In 1999, it developed GPUs to improve image display for computers.
GPUs excel at processing many small tasks simultaneously (for example, handling millions of pixels on a screen), a technique known as parallel processing.
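To make the idea concrete, here is a minimal sketch in plain Python with NumPy; the vectorized operation stands in for thousands of GPU cores applying the same step to every pixel at once (an illustration only, not Nvidia's actual code):

```python
import numpy as np

# A toy "image": one brightness value per pixel, between 0 and 1.
image = np.random.rand(108, 192)

# Sequential style: visit each pixel one after another.
def brighten_loop(img, factor):
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = min(img[i, j] * factor, 1.0)
    return out

# Data-parallel style: the same small operation is applied to
# every pixel at once, which is the pattern GPUs are built for.
def brighten_parallel(img, factor):
    return np.minimum(img * factor, 1.0)

# Both give identical results; the parallel form is what scales.
assert np.allclose(brighten_loop(image, 1.5), brighten_parallel(image, 1.5))
```

The two functions compute the same thing; the difference is that the second expresses the work as one uniform operation over all the data, which parallel hardware can split across its cores.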
In 2006, researchers at Stanford University discovered that GPUs had another use: they could accelerate mathematical operations in a way that normal processing chips could not.
At that point, Huang made a crucial decision for the development of AI as we know it.
He invested Nvidia’s resources in creating a tool to make GPUs programmable, thus opening up their parallel processing capabilities for uses beyond graphics.
That tool was added to Nvidia’s computer chips. For computer game users, it was a capability they didn’t need and probably didn’t even know about, but for researchers it was a new way to do high-performance computing.
That capability helped trigger the first breakthroughs in modern AI.
In 2012, AlexNet, an AI model that could classify images, was unveiled; it had been trained using just two of Nvidia’s programmable GPUs.
The training process took only a few days, rather than the months it might have taken with a much larger number of regular processing chips.
The discovery that GPUs could massively accelerate neural network processing began to spread among computer scientists, who started buying them to run this new kind of work.
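The fit between GPUs and neural networks is concrete: training is dominated by large matrix multiplications, each made up of many independent multiply-and-add steps that can run side by side. A minimal sketch of one network layer (plain NumPy on a CPU standing in for the GPU version; the sizes are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.standard_normal((64, 512))    # a batch of 64 training examples
weights = rng.standard_normal((512, 256))  # one layer's parameters

def layer_forward(x, w):
    # One neural-network layer: a big matrix multiply (the parallel,
    # GPU-friendly part) followed by a simple ReLU nonlinearity.
    return np.maximum(x @ w, 0.0)

activations = layer_forward(inputs, weights)
print(activations.shape)  # (64, 256)
```

That single matrix product involves tens of thousands of independent arithmetic operations, which is why moving it onto parallel hardware yields the dramatic speedups the researchers observed.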
“Artificial intelligence found us,” says Nvidia’s Ian Buck.
The company took advantage of its head start by investing in the development of new types of GPUs better suited to AI, as well as more software to make the technology easier to use.
A decade and billions of dollars later came ChatGPT, an AI that can give eerily human answers to questions.
Artificial intelligence company Metaphysic creates videos of celebrities and others using AI techniques. Its spoofs of Tom Cruise created a stir in 2021.
To train and then run its models, Metaphysic uses hundreds of Nvidia GPUs, some purchased from Nvidia and some accessed through a cloud computing service.
“There are no alternatives to Nvidia to do what we do,” says Tom Graham, its co-founder and CEO. “It’s way ahead of the curve.”
However, while Nvidia’s dominance seems assured for now, the long term is harder to predict. “Nvidia hit the target that everyone is trying to hit,” notes Kevin Krewell, another industry analyst at TIRIAS Research.
Other major semiconductor companies offer some competition. AMD and Intel are best known for making central processing units (CPUs), but both also make dedicated GPUs for AI applications; Intel entered that market only recently.
Google has its tensor processing units (TPUs), which are used not only for search results but also for certain machine learning tasks, while Amazon has a custom chip for training AI models.
Microsoft and Meta, meanwhile, are developing their own AI chips.