NVIDIA Maintains Position in 2020 Market for AI Processors for Cloud and Data Center

By Tiera Oliver

Assistant Managing Editor

Embedded Computing Design

August 06, 2021

NVIDIA maintained its position in the global market for artificial intelligence (AI) processors used in the cloud and in data centers in 2020, with an 80.6% share of global revenue, according to Omdia.

NVIDIA generated cloud and data center AI processor revenue totaling $3.2 billion in 2020, up from $1.8 billion in 2019, as reported by Omdia’s AI Processors for Cloud and Data Center Forecast Report. The company continued to benefit from its success in the market for GPU-derived chips, which currently represent the leading type of AI processor employed in cloud and data center equipment, including servers, workstations, and expansion cards. 

The market for AI processors is growing rapidly. Global market revenue for cloud and data center AI processors rose 79% to reach $4 billion in 2020, and is expected to soar by a factor of nine to $37.6 billion in 2026, according to Omdia.
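Omdia's figures imply a compound annual growth rate that the article does not state explicitly. As a quick back-of-envelope sketch (the dollar figures are Omdia's as quoted above; the 2019 base and the growth rate are derived here, not from the report):

```python
# Sanity-check the growth figures quoted from Omdia's forecast.
rev_2020 = 4.0e9            # global cloud/data center AI processor revenue, 2020
rev_2026 = 37.6e9           # Omdia's forecast for 2026
rev_2019 = rev_2020 / 1.79  # implied by the reported 79% growth in 2020 (~$2.2B)

growth_factor = rev_2026 / rev_2020      # ~9.4x, the "factor of nine"
years = 2026 - 2020
cagr = growth_factor ** (1 / years) - 1  # implied compound annual growth rate

print(f"Implied 2019 revenue: ${rev_2019 / 1e9:.1f}B")
print(f"2020-2026 growth factor: {growth_factor:.1f}x")
print(f"Implied CAGR: {cagr:.1%}")   # roughly 45% per year
```

A 9.4x expansion over six years works out to a compound annual growth rate of about 45%, consistent with the "soar by a factor of nine" characterization.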

Figure: AI processors for cloud and data center revenue forecast, world markets: 2019–26 (Source: Omdia)

During the past few years, competing suppliers, from small startups to major semiconductor vendors, have entered the AI processor market with a variety of chips: their own GPU-based designs, programmable devices, and new types of semiconductors designed specifically to accelerate deep learning.

“Despite the onslaught of new competitors and new types of chips, NVIDIA’s GPU-based devices have remained the default choice for cloud hyperscalers and on-premises data centers, partly because of their familiarity to users,” said Jonathan Cassell, principal analyst, advanced computing, at Omdia. “NVIDIA’s Compute Unified Device Architecture (CUDA) Toolkit is used nearly universally by the AI software development community, giving the company’s GPU-derived chips a huge advantage in the market. However, Omdia predicts that other chip suppliers will gain significant market share in the coming years as market acceptance increases for alternative GPU-based chips and other types of AI processors.”

In its definition of AI processors, Omdia includes only those chips that integrate distinct subsystems dedicated to AI processing. These devices include GPU-derived AI application-specific standard products (GPU-derived AI ASSPs), proprietary-core AI application-specific standard products (proprietary-core AI ASSPs), AI application-specific integrated circuits (AI ASICs), and field-programmable gate arrays (FPGAs). While central processing unit (CPU) chips like Intel’s Xeon are used for AI acceleration in cloud and data center operations, Omdia does not include these devices in its AI processor analysis.

The other top players in the cloud and data center AI processor market include:

  • Second-ranked Xilinx, which offers FPGA products commonly used for AI inferencing in cloud and data center servers.
  • Third-ranked Google, whose Tensor Processing Unit (TPU) AI ASIC is employed extensively in its own hyperscale cloud operations.
  • Fourth-ranked Intel, which supplies its Habana proprietary-core AI ASSPs and its FPGA products for AI cloud and data center servers.
  • Fifth-ranked AMD, which offers GPU-derived AI ASSPs for cloud and data center servers.

Omdia’s AI Processors for Cloud and Data Center Forecast Report features market sizes and forecasts for each category of AI processor, segmented by power consumption, inferencing vs. training, workload, vertical market, and performance.

For more information, visit: omdia.tech.informa.com

Tiera Oliver is the assistant managing editor at Embedded Computing Design. She is responsible for web content editing, product news, and story development. She also manages, edits, and develops content for ECD podcasts, including Embedded Insiders.

She uses her expertise in journalism and content management to oversee editorial content, coordinate with editors, and ensure high-quality output across web, print, and multimedia platforms. She manages diverse projects, assists in the production of digital magazines, and hosts company podcasts, conducting in-depth interviews with industry leaders to deliver engaging and insightful discussions.

Tiera attended Northern Arizona University, where she received her bachelor's degree in journalism and political science. She was also a news reporter for the student-led newspaper, The Lumberjack.