Qualcomm Challenges Nvidia with In-House AI Accelerator Chips
Qualcomm has officially entered the AI chip race with the launch of two new accelerator chips. The move marks a major shift from the company's traditional focus on smartphone and wireless connectivity semiconductors. With the new AI200 and AI250, Qualcomm has positioned itself as a challenger in the booming data center market, currently dominated by Nvidia and AMD.
Qualcomm announces AI200 and AI250 accelerator chips
According to an official announcement, Qualcomm plans to commercially release the AI200 in 2026, with the AI250 scheduled to follow in 2027. Both chips are designed for large-scale, liquid-cooled server racks, with up to 72 chips per rack operating as a single system.
Qualcomm builds its data center chips on the same Hexagon neural processing units (NPUs) found in its mobile processors. According to Durga Malladi, the company's general manager of data center and edge, this is part of a deliberate strategy: “We first proved ourselves in other domains, and then scaled up to the data center level.”
The new AI chips are competing on cost, efficiency, and flexibility
Unlike Nvidia, whose GPUs are primarily used for training AI models, Qualcomm’s chips focus on inference, running pre-trained models efficiently. The company claims its rack-scale systems will cost less to operate, with each rack consuming around 160 kilowatts, roughly in line with Nvidia’s rack systems.
Malladi also said that Qualcomm will offer modular sales: clients will be able to purchase full racks or individual components, and even competitors like Nvidia or AMD could use Qualcomm’s CPUs or other data center parts. The new AI cards are also said to support 768GB of memory, surpassing both Nvidia and AMD in this metric.
The post Qualcomm Challenges Nvidia with In-House AI Accelerator Chips appeared first on Android Headlines.


