I liken it to when ultrabooks and netbooks launched in a time when most people were offline and uninterested in connectivity. Full featured laptops existed then, as well.
In this case, dedicated AI capacity has been in systems for a while behind the scenes (e.g., Intel's GNA) but without consumer demand; it's OEMs and OS/software vendors who are most interested in pushing the "AI Ready" standard.
If Microsoft built a set of local tools and models that integrated with Office or other applications (like Krita + ComfyUI) that were "sold" out of the Microsoft Store and only available on AI Ready systems, it might ignite some consumer interest.
However, leading with "Look how we can record your screen in secret and use AI to harvest your private information!" instead of "Here's you as a cartoon character" more or less cemented everyone's distrust of their intentions.
Netbooks were so underrated.
Small-screen, full-OS laptops for $300, at a time before the iPad/iOS era of dumbing down computers. I wrote most of my undergrad thesis on a Dell Mini, with Stata installed for regression analysis. A 9-inch screen, 1 GB RAM, and a 1 GHz Intel Atom were plenty for handling office work and watching YouTube. Its battery life was actually better than most high-TDP Intel laptops, and games from up to 2003-04 ran very well on it.
We must have had a different experience.
I remember netbooks like the Asus Eee PC and Dell Mini (which I think I had for a spell) being _very_ portable but capable of _maybe_ three hours of battery life on the most conservative power settings, which made these tiny laptops dog slow most of the time.
That’s interesting because I knew a fair number of people who had those and got rid of them due to terrible performance and battery life. One of the big ways they hit the low price point was using spinning metal drives, which really held back the entire system.
I agree, but only when running Linux. The Windows-powered netbooks I tried were all too underpowered for the OS: input was super laggy, and waiting for an application to load was aggravating. If you were very disciplined and only kept one app open at a time, and never more than five browser tabs, you could live with it, but it wasn't a great experience for me.
I wish we could get a mainline manufacturer like HP or Dell to come back around with a 7"-9" ultraportable laptop.
This niche is pretty well served by some of the portable gaming PC companies that have added business netbooks to their lines. GPD has its mini 8" convertible, which isn't bad: the 4 just came out with ridiculous specs for such a small machine, but with poor battery life to match, while the 3 can be had cheaper with less powerful CPUs and better battery. I'm using a OneXPro Netbook 5, a really high-quality 10-incher that compiles large Rust codebases fast enough to keep me happy and doesn't suck at all to type on; battery life is fine at 5 hours, with really fast charging.
Tablets with portable keyboards replaced that niche, mostly.
This is served by tablets with a keyboard, like the iPad Pro or the Samsung Galaxy Tab.
The PC market is too enshittified for this to happen. Companies no longer differentiate on hardware but on "subscriptions and services". Portable gaming machines like Steam Deck, ASUS ROG Ally and some indie OEMs are the closest thing we have to a mainstream mini PC.
You must have had one of the better ones. Most of them were literally not usable.
Even YouTube was pretty choppy on most of these things, and the screen resolution was so bad that web browsing was terrible. A lot of dialog boxes wouldn't even fit on the screen, leaving you stuck and unable to change settings or even save files in many situations. The drives were slow in most of them, too.
Not to say that they couldn't be better, but they were rated poorly because most of them really were trash.
Asus ones like the 1215B were quite good; mine lasted from 2009 to 2024, and now a tablet has taken over its role as my travel computing device.
I used one as well; I would just remote-desktop into my box running in my dorm.
AI has become a negative brand for a fast-growing cohort. When you advertise something as having AI or adding AI, people are turned off rather than excited. And even if they feel neutral about it, they'll certainly opt to spend less money on the non-AI version if they can.
It's simply another sign that we're at the tail-end of a massive bubble that's about to burst spectacularly.
Makes sense. I wouldn't want to pay a premium for a feature that I'm not interested in having.
My employer just refreshed all of our dev machines, and they didn't go with "AI PCs" either.
I think there's a mismatch between Intel's/Microsoft's idea of an AI PC, a beefed-up edge-TPU core, and what customers who do want AI actually want: LLMs, which I'm under the impression the Lunar Lake AI features aren't that great at accelerating (though they're presumably good at non-LLM vision and audio processing). If I'm misunderstanding this, I'd love a reply.
Having a ChatGPT-style experience locally requires roughly 24 GB of dedicated VRAM at a bandwidth of about 1 TB/s. These "AI PCs" are nothing close to that; they can only run much smaller neural networks for a small variety of data operations. Think Photoshop filters. Any AI chat experience on an "AI PC" is still being sent over the network to a datacenter. Some new Photoshop filters do not define a whole new category of computer.
Exactly. The problem is that Intel's "AI PCs" may be PCs, but they're too underpowered for AI work. Apple makes real AI PCs, and they sell well enough; the latest AMD chips are getting into that territory as well. I want to see Intel succeed, but they repeatedly miss the boat by over-promising and under-delivering, as if they're afraid of providing too much value too soon.
Premium price is one issue; the second is dependence on some proprietary AI system. I remember buying LG and Samsung phones with their own systems, which they stopped supporting when they switched to Android.
And you don't want to buy a premium brick if you don't have to.
And, those premium prices can pay for a whole bunch of cloud compute time, almost certainly with lower latency than local.
I actually want an AI PC, but not the kind that's being marketed. Basically, I want to see a cheaper equivalent of the Mac M3 Ultra with about 128 GB of unified memory combined with a discrete-class GPU on board. AMD is moving in that direction and might get there in three or so generations. You could perhaps get away with less hardware if software improves before then.
Nvidia's DGX Spark is a bit of a joke with only 273 GB/s of memory bandwidth, which is less than the 10-year-old Radeon Fury X's 512 GB/s, let alone the current RTX 5090's 1.79 TB/s.
AMD is there right now with their Strix Halo processors.
E.g. https://frame.work/de/en/desktop 128 GB of RAM, gigantic memory bandwidth, a great GPU, and a great CPU.
Looks like Ryzen AI Max is quad-channel soldered DDR5-8000, which works out to 256 GB/s. Not bad, but if you use a model that fills that 128 GB, you're only getting about 2 tokens per second.
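The 2 tokens-per-second figure falls out of simple memory-bandwidth arithmetic: dense LLM decoding is bandwidth-bound because every weight is read once per generated token, so the upper bound is bandwidth divided by model size. A rough sketch of the calculation (assuming a 64-bit bus per DDR5 channel; function names are illustrative):

```python
# Back-of-envelope estimate of decode speed for a memory-bandwidth-bound LLM.

def peak_bandwidth_gbs(channels: int, bus_bits: int, mts: int) -> float:
    """Peak memory bandwidth in GB/s: channels * bytes per transfer * MT/s."""
    return channels * (bus_bits / 8) * mts / 1000

def tokens_per_second(bandwidth_gbs: float, model_gb: float) -> float:
    """Upper bound on tokens/s for a dense model occupying model_gb gigabytes."""
    return bandwidth_gbs / model_gb

# Quad-channel DDR5-8000, 64-bit channels: 4 * 8 B * 8000 MT/s = 256 GB/s.
bw = peak_bandwidth_gbs(channels=4, bus_bits=64, mts=8000)
print(f"bandwidth: {bw:.0f} GB/s")                       # 256 GB/s
print(f"128 GB model: {tokens_per_second(bw, 128):.1f} tok/s")  # 2.0 tok/s
```

Real-world throughput lands below this ceiling (attention overhead, cache effects), which is why a smaller quantized model is usually the practical choice on such hardware.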
Memory bandwidth is 256 GB/s, which is shared by CPU, GPU, and NPU. So not comparable to a discrete GPU.
Of course, AI PCs are useless on GNU/Linux, and no one wants the return of Clippy and Minority Report on their Windows machines.
I doubt Copilot+ PCs with Qualcomm chips are doing any better.
> AI PCs are useless on GNU/Linux
The AI part is useless because there's no Windows, but "AI" laptops typically have beefy Intel/AMD APUs, which most definitely makes life nice while running Linux.
Frustratingly, though, it's going to get harder and harder to avoid.
Even looking at building my own machine today, "AI PC" is plastered all over much of this.
More expensive than Apple's ARM computers, less battery efficient. Lame.
(and I bet this will stand up to the test of time better than the original quote)
It would be easier to agree with you if any PC manufacturers were rushing out to fill the consumer ARM PC market. But they aren't. Qualcomm is sorta just jerking off, Nvidia has the licenses but doesn't want to negotiate with Softbank, and AMD hasn't made a serious ARM-related effort in a decade. The only ARM PC options are the ones that lock you in with MacOS and Apple as your computer nanny. And those customers would buy a Power ISA laptop in 2025 if Tim Cook said it was cutting-edge.
In the end, it sure looks like the ARM detractors were right. Architectural licenses made it so nobody except Softbank partners could afford to design an ARM core. Stock ARM core designs are outright anemic and inefficient even compared to decade-old x86 cores, meaning Broadcom or Rockchip can't just release a board and compete with AMD and Intel.
Nowadays, international detente is down the toilet, Softbank is seen as a penny-pincher, and RISC-V is already being shipped in place of ARM microprocessors in mass-market computer hardware. Apple basically ruined the entire ISA in the process of making it marginally competitive with generations-old silicon.
It doesn't matter why Intel products are worse in every dimension; the fact is that they are, and looking at Intel's most recent quarterly results, customers and the market agree.
They lost over $800 million last quarter.
I agree that Intel is now reaping precisely what they've sown. But if making money is the goal, buying an ARM architectural license would have put Intel in an early grave. The rest of the industry agrees: nobody without direct leverage over Softbank is able to turn a profit designing and shipping ARM hardware.
I understand how Mac users would argue that sacrificing ARM to Moloch makes great PCs. But it also hamstrings the rest of the industry, prevents direct competition, slows down ARM proliferation and accelerates the development of foreign replacements like RISC-V. As a fan of ARM, I cannot say I'm happy with the stewardship Softbank exercises over the IP.
Insane idea: Intel releases a big RISC-V product, even if it's just an existing Lake with a new decoder on the front.
I could imagine it sucking a lot of air out of the competitive ecosystem. Anyone working with fiddly little Raspberry Pi-class products to help bootstrap the platform up to full desktop performance suddenly has to compete with a much heavier opponent.
And does that serve any real market? Mobile? Low-end laptops (Chromebooks)? Regular laptops? Desktops (a small part of the market)? Servers?
It torpedoes an emerging threat.
How is ARM proliferation being slowed down? Amazon, Microsoft, Meta and Google are all designing their own ARM based server chips.
Apple has been making money hand over fist designing ARM based hardware since 2010 with the A4.
Amazon is saving billions using its own custom chips for both servers and specialized networking chips.
But the fact is that Intel has to figure something out. Right now x86 may not be dead, but it's smelling funny. They're selling fewer desktops and servers, and are nonexistent in mobile.
Won’t big enterprises eat this stuff up once they figure out Microsoft Recall lets them copy employees in 5-10 years by recording everything every employee does on every work computer and training AI on all that juicy “behavior cloning” data?
Terrible for wagies, great for capitalist equity lords
You do not need "AI" to record stuff. They can use keyloggers, log changes to files, etc., on any machine and any OS, and use it to train AI later.