Chipmakers and PC vendors for more than a year have been zeroing in on the promised coming of the AI PC boom, a time when users and developers will be able to run a broad array of AI tasks locally on their systems and — hopefully — give a charge to a global PC market that has slowed in the post-pandemic years.
Already this year the industry has seen Intel, NVIDIA, and Arm roll out new processors, platforms, and architectures aimed at delivering the needed compute to these systems, which are the cornerstones of the burgeoning edge AI push. Most recently, AMD and Qualcomm also expanded their offerings.
AMD earlier this month rolled out its Ryzen Pro 8000 Series desktop chips aimed at AI PCs for business users, promising high performance and low power consumption. That came a week after the vendor unveiled a second-generation portfolio of its Versal systems-on-chip (SoCs) for both AI-enabled and traditional embedded devices.
This week, Qualcomm introduced its Snapdragon X Plus platform, which includes a 10-core Arm-based Oryon CPU and a Hexagon neural processing unit (NPU), both key for PCs running AI tasks. The platform is aimed at Windows-based PCs, which will put it in even more direct competition with x86 chip makers Intel and AMD.
“By delivering leading CPU performance, AI capabilities, and power efficiency, we are once again pushing the boundaries of what is possible in mobile computing,” Kedar Kondap, senior vice president and general manager of compute and gaming at Qualcomm Technologies, said in a statement.
The Emerging AI PC Market
The silicon vendors are all competing in an emerging AI PC market that is set to expand rapidly in the coming years, with the systems able to run such AI tasks as model training, natural language processing, image and speech recognition, and computer vision. According to analysts at market research firm Canalys, about 48 million AI-capable PCs will ship worldwide this year, accounting for about 18% of all PC shipments. However, the number shipping next year will be more than 100 million, or 40% of all shipments.
In 2028, 205 million AI PCs will ship, growing at an average of 44% a year between now and then, Canalys said.
“The wider availability of AI-accelerating silicon in personal computing will be transformative, leading to over 150 million AI-capable PCs shipping through to the end of 2025,” Canalys Principal Analyst Ishan Dutt said. “This emerging PC category opens new frontiers for both software developers and hardware vendors to innovate and deliver compelling use cases to customers across consumer, commercial and education scenarios.”
This will be welcome news to PC and component makers, who saw demand spike at the onset of the COVID-19 pandemic as businesses shifted hard to remote-work scenarios, only to watch it then plummet as the public health threat lifted.
According to Gartner, 2023 was the worst year in the history of PCs, with shipments — coming in at 241.8 million units — falling year-over-year by 14.8%. It was the second consecutive year of a double-digit decline. That said, IDC is predicting 265.4 million units will move this year — a 2% rise — thanks in large part to the rise of AI PCs, as well as the need to replace systems that were bought during the pandemic.
Pushing out to the Edge
The draw of AI PCs, and of AI capabilities on other edge devices, stems from where AI applications typically live today: on large servers in cloud data centers, accessed via the internet. Sending all the data generated at the edge back to the cloud is costly and can cause latency and security problems. In addition, the cloud doesn't help if the PC has no internet connection.
At the same time, Gartner analysts predict that by next year, as much as 75% of data will be generated at the edge, so the demand is growing to be able to analyze, store, and process all that data where it’s created rather than transferring it to the cloud.
Writing about Qualcomm’s efforts, Tim Bajarin, chairman of market research firm Creative Strategies, said AI PCs are a significant step forward for the industry and users.
“This strategic shift is not just about enhancing computational power or efficiency; it’s about reimagining what PCs can do with AI at their core,” Bajarin wrote. “This vision is shared by Intel, AMD, Microsoft, and others who are creating a new generation of PCs that are capable of delivering AI functionalities on the PC itself.”
Microsoft, Apple, and others already are pushing out small language models (SLMs) that can be used by PCs and edge devices to run AI applications. The processors to power those systems are coming out fast and furious, with Intel and its Core Ultra, NVIDIA and its Blackwell GPUs, and Arm and its Ethos-U85 NPUs being positioned for work at the edge.
‘Everyone’s Going to Want an AI PC’
Qualcomm’s Snapdragon X Plus comes on the heels of its Snapdragon X Elite, launched in October, another processor that includes the chip maker’s AI engine and pairs a 12-core Oryon CPU with a Hexagon NPU, likewise aimed at AI workloads on Windows systems. Qualcomm expects systems running both chips to start appearing later this year, and both can come with up to 64GB of memory.
The company claims the Snapdragon X Plus delivers up to 37% faster CPU performance than competing Intel and AMD chips while consuming 54% less power.
AMD expects systems powered by its new Ryzen Pro 8000 chips to hit the market in the second quarter this year.
In an interview with CNBC, AMD CEO Lisa Su said ChatGPT and large language models showed businesses can access a lot of capability through training and inferencing in the cloud.
“But it turns out people have a lot of personal data and the way you use your PC, it’s actually a personal productivity tool,” Su said. “What I see is we’re at the beginning of the era where we can make much more capable personal assistants in the PC form factor. … As the technology gets better, I am absolutely sure that everyone’s going to want an AI PC.”
The post Qualcomm, AMD Add Fuel to the AI PC Engine appeared first on The New Stack.