The PC industry is losing the race for local AI
It’s not enough to champion AI hardware that supports local large language models, generative AI, and the like. Hardware vendors need to step up and serve as a middleman — if not an outright developer — for those local AI apps, too.
Qualcomm almost has it. At MWC 2024 (formerly Mobile World Congress, one of the world’s largest mobile trade shows), the company this week announced the Qualcomm AI Hub, a repository of more than 75 AI models optimized specifically for Qualcomm and Snapdragon platforms. Qualcomm also showed off a seven-billion-parameter local LLM that accepts audio input, running on a (presumably Snapdragon-powered) PC. Finally, Qualcomm demonstrated another seven-billion-parameter LLM running on Snapdragon phones.
That’s all well and good, but more PC and chip vendors will have to demonstrate real-world examples of AI. Qualcomm’s AI Hub is a good start, even if it’s aimed at developers. But the only way chip and PC vendors will convince users to adopt local AI is to make it easy, cheap, and so, so available. So far, very few have stepped up to do so.
The PC industry tends to seize upon any trend it can because PC hardware sales are perpetually undercut by smartphones, the cloud, and other devices that threaten its dominance. While laptop and Chromebook sales soared during the pandemic, they’ve come back to earth, hard. The argument that you’ll need a local PC with high-end hardware to run the next big thing, AI, is one that should have the PC industry slavering.
But the first examples of AI ran in the cloud, which puts the PC industry behind. This feels like a point I harp upon, but I’ll say it again: Microsoft doesn’t seem particularly invested in local AI quite yet. Everything with Microsoft’s “Copilot” brand in its name runs in the cloud, and generally requires either a subscription or at least a Microsoft account to use. (Copilot can be used with a Windows local account, but only a limited number of times before it forces you to sign in to continue using it.)
Most people probably aren’t convinced that they need to use AI at all, let alone run it locally on their PC. This is the problem chip and hardware vendors need to solve. But the solution isn’t hardware; it’s software.
The answer is apps: lots and lots of apps
Microsoft introduced an AI Hub to the Microsoft Store app last year, but even today it feels a little lackluster. Most of the chatbot “apps” it offers actually run in the cloud and require a subscription, which makes little sense when Copilot is essentially free. Ditto for apps like Adobe Lightroom and ACDSee: they’re subscription-based, too, a fee a truly local app could avoid by tapping the power of your own PC.
That leaves hardware vendors to carry the torch. And some have: MSI, for example, makes an “AI Artist” generative AI app available for its latest MSI Raider GE78 gaming laptop. While it’s a little clunky and slow, at least it provides a one-click installation procedure from a trusted vendor.
But that’s an oasis in a desert of local AI. Both AMD and Intel tout the performance of their chips on AI language models like Llama 2. That makes sense for those who have already tried AI chatbots and are familiar with the various models and how they work. AMD, for example, has laid out specifically which applications take advantage of its NPU, which it brands as Ryzen AI.
With respect, that’s not quite good enough. It isn’t enough that Intel launched an AI development fund last year. What chip vendors need to do is get those AI apps into consumers’ hands.
How? One method is already tried and true: trial subscriptions of AI-powered apps like Adobe Photoshop, Blackmagic’s DaVinci Resolve, or Topaz. Customers traditionally don’t like bloatware, but I think that if the PC industry is going to market AI PCs, it’s going to have to take a stab at a creator-class PC that leans into them. Instead of running with “Intel Inside,” start marketing “AI bundles.” Lean into the software, not the hardware. Start putting app logos on the outside of the box, too. Would Adobe be willing to put its stamp on a “Photoshop-certified” PC? It’s a thought.
Otherwise, I’d suggest one of the better ideas that Intel had: a return of the gaming bundle. Today, both Intel and AMD may bundle a game like Assassin’s Creed Mirage with the purchase of a qualifying CPU or motherboard. But not too long ago, you could download several games, for free, that would showcase the power of the CPU. (Here’s an MSI example from 2018, below.)
Running AI locally does offer some compelling advantages: privacy, for one. But the convenience factor of Copilot and Bard is a powerful argument to use those sanitized tools instead. Consumers are fickle and won’t care, either, unless someone shows them they should.
If AMD, Intel, and eventually Qualcomm plan to make local AI a reality, they’re going to have to make that option simple, cheap, and ubiquitous. And with the AI hype train already barreling full speed ahead, they need to do it yesterday.
Author: Mark Hachman, Senior Editor