Apple needs on-device AI to do chores for me with the apps I have installed. Apple has everything it needs:
* Apps are already logged in, so no extra friction to grant access.
* Apps mostly use Apple-developed UI frameworks, so Apple could turn them into AI-readable representations, instead of raw pixels. In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.
* iPhones already have specialized hardware for AI acceleration.
I want to be able to tell my phone to a) summarize my finances across all the apps I have b) give me a list of new articles of a certain topic from my magazine/news apps c) combine internet search with on-device files to generate personal reports.
All this is possible, but Apple doesn't care to do this. The path not taken is invisible, and no one will criticize them for squandering this opportunity. That's a more subtle drawback with only having two phone operating systems.
> iPhones already have specialized hardware for AI acceleration.
This really is the problem. Why do I spend hundreds of dollars more for specialized hardware that’s better than last years specialized hardware if all the AI features are going to be an API call to chatGPT? I am pretty sure I don’t need all of that hardware to watch YouTube videos or scroll Instagram/web, which is what 95% of the users do.
This is all possible, but an absolutely terrible idea from a security point of view, while prompt injection attacks are still a thing, and there's little evidence they will stop being a thing soon.
> Apple needs on-device AI to do chores for me with the apps I have installed
Nevermind that—iOS just needs to reliably be able to play the song I’m telling it to without complaining “sorry, something went wrong with the connection…”
I think on-device AI will show up more front and center but in a few more years.
A big issue to solve is battery life. Right now there's already a lot that goes on at night while the user sleeps with their phone plugged in. This helps to preserve battery life because you can run intensive tasks while hooked up to a power source.
If apps are doing a lot of AI stuff in the course of regular interaction, that could drain the battery fairly quickly.
Amazingly, I think the memory footprint of the phones will also need to get quite a bit larger to really support the big uses cases and workflows. (I do feel somewhat crazy that it is already possible to purchase an iPhone with 1TB of storage and 8GB of RAM).
2TB microsdxc cards have been available for a year or so, and 1TB cards have been available for several years and are even quite affordable. They work in many Android phones including my cheap Motorola. So it's Apple's sky-high premiums that has made their 1TB phones surprising.
I agree completely, it's really unfortunate how AI on apple devices has been going. The message summarization is borderline useless and widely mocked, meanwhile their giant billboard ads for it are largely stupid and uncompelling. Let me choose to give it access to my data if I want to do really useful stuff with on device processing. They've been leaning into the privacy thing, do the stuff that would be creepy if it left my device, generate push notification reminders for stuff I forgot to put in the calendar, or track my location and tell me I'm going to the wrong airport. Suggest birthday gifts for my friends and family, idk.
Edit: And add strong controls to limit what it can and cannot access, especially for the creepy stuff.
Apple is generally anti market hype. It is a smart PR move to avoid mentioning AI after the Apple Intelligence fiasco, their researchers leaving, and the bubble sentiment at the moment.
> In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.
Apps already have such an accessibility tree; it's used for VoiceOver and you can use it to write UI unit tests. (If you haven't tested your own app with VoiceOver, you should.)
This is a non-story. This was a hardware event. Apple is releasing many new AI features as part of iOS 26 which will launch along side the new iPhones. AI is software. And yet, a number of the features are clearly powered by AI models such as camera enhancements, health monitoring and live translation. Also GPU performance continues to increase in the A19, with CPU remaining presumably fairly flat since no numbers were given, so that’s a win for on-device inference.
If Apple had an insanely great AI feature that truly differentiated itself from their competition, we all know they'd take a lot of time focusing on how their hardware enabled or enhanced that functionality.
The expectation is that Apple will eventually launch a revolutionary new product, service or feature based around AI. This is the company that envisioned the Knowledge Navigator in the 80s after all. The story is simply that it hasn't happened yet. That doesn't make it a non-story, simply an obvious one.
I think this is the correct approach on a phone. I don't want AI front-and-center. I want it in the background quietly making everything better. To me, that's a much more useful form of AI.
Yeah, the annonying movies IPhone does from my photo library is something I'd love to opt out of. I get that some people love this feature but I don't. And as you say, I could ask the phone to do that for me on request.
You're letting the hype men set the goalposts for you, then, as every ML thing has been retroactively rebranded as "AI."
Remember the term "smart" as applied to any device or software mode that made ~any assumptions beyond "stay on while trigger is held"? "AI" is the new "smart." Even expert systems, decision trees, and fulltext search are "AI" now.
> You're letting the hype men set the goalposts for you, then
Not really, I'm taking the hint. If they call a feature "AI", there's a 99% chance it's empty hype. If they call a feature "machine learning", there may be something useful in there.
Notice how Apple, in this event even, uses the term "machine learning" for some features (like some of their image processing stuff) and "AI" for other features. Their usage of the terms more or less matches my line of features I want and features I don't want.
Well, yeah, Apple is being reasonable now because Apple just got through a big bad PR thing with their recent failed attempt at "AI". Apple are currently trying, as much as possible, to avoid applying the term "AI" to anything.
But that's not true of any other actor in the market. Everyone else — but especially venture-backed companies trying to get/retain investor interest — are still trying to find a justification for calling every single thing they're selling "AI".
(And it's also not even true of Apple themselves as recently as six months ago. They were approaching their marketing this way too, right up until their whole "AI" team crashed and burned.)
Apple-of-H2-2025 is literally the only company your heuristic will actually spit out any useful information for. For everything else, you'll just end up with 100% false positives.
Not to be too, um, dismissive, but one of the things we discussed in my 300 level class called _Artificial Intelligence_ in college 2 decades ago was regular expressions, so, that ship has sailed far over the horizon.
I think it is just generally the correct approach. It was not that long ago that ML (outside of research) was often talked about but it being ML was not the focus. The focus was on the actual benefits and what it did.
Of course this is going to be spun and turned into a negative, but I basically want ML to be invisible again. The benefits being clear, but the underlying tech no longer mattering.
Apple has so hardwired my neurons to expect "It just won't work" when it comes to Siri that if I cannot physically reach my device, I just don't bother anymore.
I have a similar philosophy about home automation. Every few years I geek out and set up a bunch of crap and spend a bunch of money and waste a lot of time, and then it tends to fall apart fairly quickly and I repent of everything...
...except for the motion-activated lighting in our foyer and laundry room. $15, 15 minutes to install, no additional charges, no external services, no security issues, and just works year after year with 100% reliability.
They did talk about ML in their image processing software for their cameras, though. And I don’t think they mentioned AI when talking about their babelfish AirPods, but it’s clearly there, as well.
Yeah. I think they talked about ML for the Watch workout feature as well (or was it the AirPods?) And I think for other features as well but I am not certain and I am not rewatching the whole thing.
They’ve been putting AI in a lot of places over the years.
If you've read the Steve Jibs biography, you’d know his obsession with creating the future products.
I feel like he’d be obsessively working to combine AI, robotics and battery technology into the classic sci fi android.
Instead, modern Apple seems to be innovating essentially nothing unless you count the VR thing and the rumors of an Apple car, which sounds to me much like the Apple Newton.
I'd disagree: his obsession was with watching everyone fail to create the products of the future, learning from their mistakes, and only then committing to a product strategy (e.g: iPod, iPhone, iPad).
That's because people don't trust AI that much... and most people don't differentiate between a really cool feature and AI so it's more confusing marketing than anything.
Apple fans would be going crazy with excitement if Apple unveiled Gemini 2.5 Flash Image (AKA "nano banana"). It's the most powerful image editing tool since Photoshop 1.0.
How are they behind, in terms of things normal people actually use? Apps like ChatGPT work just fine, and I have literally never heard an Android user I know personally talk about something they can do which isn’t available on iOS.
That’s not true. There were multiple AI features presented, just very well-known ones like the babel fish AirPods Pro. Nothing fancy, but it’s still AI and it’s useful.
I feel like they don’t want to get burned the way they did with the AVR/metaverse thing. Those headsets are worthless. Probably a lot of ptsd going on with people thinking how Jobs would have had their ass over it. I suspect they’re being very cautious with “AI”.
This is what I like about Apple. They just use technology judiciously without making a big deal about it, and talk up the product instead. As it should be.
Good.
Apple needs on-device AI to do chores for me with the apps I have installed. Apple has everything it needs:
* Apps are already logged in, so no extra friction to grant access.
* Apps mostly use Apple-developed UI frameworks, so Apple could turn them into AI-readable representations, instead of raw pixels. In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.
* iPhones already have specialized hardware for AI acceleration.
I want to be able to tell my phone to a) summarize my finances across all the apps I have b) give me a list of new articles of a certain topic from my magazine/news apps c) combine internet search with on-device files to generate personal reports.
All this is possible, but Apple doesn't care to do this. The path not taken is invisible, and no one will criticize them for squandering this opportunity. That's a more subtle drawback with only having two phone operating systems.
> iPhones already have specialized hardware for AI acceleration.
This really is the problem. Why do I spend hundreds of dollars more for specialized hardware that’s better than last years specialized hardware if all the AI features are going to be an API call to chatGPT? I am pretty sure I don’t need all of that hardware to watch YouTube videos or scroll Instagram/web, which is what 95% of the users do.
This is all possible, but an absolutely terrible idea from a security point of view, while prompt injection attacks are still a thing, and there's little evidence they will stop being a thing soon.
> Apple needs on-device AI to do chores for me with the apps I have installed
Nevermind that—iOS just needs to reliably be able to play the song I’m telling it to without complaining “sorry, something went wrong with the connection…”
I think on-device AI will show up more front and center but in a few more years.
A big issue to solve is battery life. Right now there's already a lot that goes on at night while the user sleeps with their phone plugged in. This helps to preserve battery life because you can run intensive tasks while hooked up to a power source.
If apps are doing a lot of AI stuff in the course of regular interaction, that could drain the battery fairly quickly.
Amazingly, I think the memory footprint of the phones will also need to get quite a bit larger to really support the big uses cases and workflows. (I do feel somewhat crazy that it is already possible to purchase an iPhone with 1TB of storage and 8GB of RAM).
2TB microsdxc cards have been available for a year or so, and 1TB cards have been available for several years and are even quite affordable. They work in many Android phones including my cheap Motorola. So it's Apple's sky-high premiums that has made their 1TB phones surprising.
https://www.bhphotovideo.com/c/product/1868375-REG/sandisk_s... 2TB $185
https://www.bhphotovideo.com/c/product/1692704-REG/sandisk_s... 1TB $90
https://www.bhphotovideo.com/c/product/1712751-REG/sandisk_s... 512GB $40
I agree completely, it's really unfortunate how AI on apple devices has been going. The message summarization is borderline useless and widely mocked, meanwhile their giant billboard ads for it are largely stupid and uncompelling. Let me choose to give it access to my data if I want to do really useful stuff with on device processing. They've been leaning into the privacy thing, do the stuff that would be creepy if it left my device, generate push notification reminders for stuff I forgot to put in the calendar, or track my location and tell me I'm going to the wrong airport. Suggest birthday gifts for my friends and family, idk.
Edit: And add strong controls to limit what it can and cannot access, especially for the creepy stuff.
They've being doing some research on this: https://machinelearning.apple.com/research/ferretui-mobile
Apple is generally anti market hype. It is a smart PR move to avoid mentioning AI after the Apple Intelligence fiasco, their researchers leaving, and the bubble sentiment at the moment.
You are missing the point. Why was Apple Intelligence a fiasco? Because they failed to understand what users like GP wanted.
> In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.
Apps already have such an accessibility tree; it's used for VoiceOver and you can use it to write UI unit tests. (If you haven't tested your own app with VoiceOver, you should.)
[dead]
This is a non-story. This was a hardware event. Apple is releasing many new AI features as part of iOS 26 which will launch along side the new iPhones. AI is software. And yet, a number of the features are clearly powered by AI models such as camera enhancements, health monitoring and live translation. Also GPU performance continues to increase in the A19, with CPU remaining presumably fairly flat since no numbers were given, so that’s a win for on-device inference.
If Apple had an insanely great AI feature that truly differentiated itself from their competition, we all know they'd take a lot of time focusing on how their hardware enabled or enhanced that functionality.
The expectation is that Apple will eventually launch a revolutionary new product, service or feature based around AI. This is the company that envisioned the Knowledge Navigator in the 80s after all. The story is simply that it hasn't happened yet. That doesn't make it a non-story, simply an obvious one.
I think this is the correct approach on a phone. I don't want AI front-and-center. I want it in the background quietly making everything better. To me, that's a much more useful form of AI.
And I want it turned the fuck off, quietly not doing anything with my personal shit.
I want to reach for my tools when I want to use them.
Yeah, the annonying movies IPhone does from my photo library is something I'd love to opt out of. I get that some people love this feature but I don't. And as you say, I could ask the phone to do that for me on request.
Disable “show featured content” in Photos.
I love this feature on both Photos and Google Photos
I'll argue that face recognition, event detection and share recommendations are nice features.
They are all done locally on your device for the last decade, at least.
[flagged]
You're letting the hype men set the goalposts for you, then, as every ML thing has been retroactively rebranded as "AI."
Remember the term "smart" as applied to any device or software mode that made ~any assumptions beyond "stay on while trigger is held"? "AI" is the new "smart." Even expert systems, decision trees, and fulltext search are "AI" now.
> You're letting the hype men set the goalposts for you, then
Not really, I'm taking the hint. If they call a feature "AI", there's a 99% chance it's empty hype. If they call a feature "machine learning", there may be something useful in there.
Notice how Apple, in this event even, uses the term "machine learning" for some features (like some of their image processing stuff) and "AI" for other features. Their usage of the terms more or less matches my line of features I want and features I don't want.
Well, yeah, Apple is being reasonable now because Apple just got through a big bad PR thing with their recent failed attempt at "AI". Apple are currently trying, as much as possible, to avoid applying the term "AI" to anything.
But that's not true of any other actor in the market. Everyone else — but especially venture-backed companies trying to get/retain investor interest — are still trying to find a justification for calling every single thing they're selling "AI".
(And it's also not even true of Apple themselves as recently as six months ago. They were approaching their marketing this way too, right up until their whole "AI" team crashed and burned.)
Apple-of-H2-2025 is literally the only company your heuristic will actually spit out any useful information for. For everything else, you'll just end up with 100% false positives.
Semantically, AI has always been a superset of ML. So it's always been correct to call machine learning AI.
All machine learning is AI, not all AI is machine learning.
Yet there's a very clear distinction between when companies use the term "AI" and when they use "machine learning".
that stuff is also essentially machine learning, just more parameters and better marketing
Not to be too, um, dismissive, but one of the things we discussed in my 300 level class called _Artificial Intelligence_ in college 2 decades ago was regular expressions, so, that ship has sailed far over the horizon.
I think it is just generally the correct approach. It was not that long ago that ML (outside of research) was often talked about but it being ML was not the focus. The focus was on the actual benefits and what it did.
Of course this is going to be spun and turned into a negative, but I basically want ML to be invisible again. The benefits being clear, but the underlying tech no longer mattering.
Actually if Siri ever worked when I really need it, they could finally catch up with their promises made in 2011.
Right? Siri lets me down on a regular basis - and has for years.
Apple has so hardwired my neurons to expect "It just won't work" when it comes to Siri that if I cannot physically reach my device, I just don't bother anymore.
Indeed, a good AI Snitch runs in background on your phone. The era of intelligent spyware.
I have a similar philosophy about home automation. Every few years I geek out and set up a bunch of crap and spend a bunch of money and waste a lot of time, and then it tends to fall apart fairly quickly and I repent of everything...
...except for the motion-activated lighting in our foyer and laundry room. $15, 15 minutes to install, no additional charges, no external services, no security issues, and just works year after year with 100% reliability.
"Any sufficiently advanced home automation is indistinguishable from a haunting."
They did talk about ML in their image processing software for their cameras, though. And I don’t think they mentioned AI when talking about their babelfish AirPods, but it’s clearly there, as well.
They talk about "Apple Intelligence models" running on the phone. They dropped that name for their AI stuff in a few other spots, too.
Yeah. I think they talked about ML for the Watch workout feature as well (or was it the AirPods?) And I think for other features as well but I am not certain and I am not rewatching the whole thing.
They’ve been putting AI in a lot of places over the years.
If you've read the Steve Jibs biography, you’d know his obsession with creating the future products.
I feel like he’d be obsessively working to combine AI, robotics and battery technology into the classic sci fi android.
Instead, modern Apple seems to be innovating essentially nothing unless you count the VR thing and the rumors of an Apple car, which sounds to me much like the Apple Newton.
I'd disagree: his obsession was with watching everyone fail to create the products of the future, learning from their mistakes, and only then committing to a product strategy (e.g: iPod, iPhone, iPad).
That's because people don't trust AI that much... and most people don't differentiate between a really cool feature and AI so it's more confusing marketing than anything.
> That's because people don't trust AI that much
ChatGPT being the number one app is a weird way for people to express they don't trust AI: https://apps.apple.com/us/charts/iphone
And with Gemini being the 2nd.
Number one free app. Generative AI doesn’t sell phones.
I think it's more because they're way behind on AI and they literally have nothing to actually tell us.
They’re not way behind on AI, they’re way behind on LLMs. Several features they presented depend heavily on AI.
As an Apple user, what are they behind on? What features are other makers shipping that Apple is missing?
(Genuinely curious, perhaps there are third-party apps I can use to bridge the gap.)
Apple fans would be going crazy with excitement if Apple unveiled Gemini 2.5 Flash Image (AKA "nano banana"). It's the most powerful image editing tool since Photoshop 1.0.
Pardon my ignorance, but is this part of Android or a standalone app?
Is it something that is not usable on Apple devices?
It powers part of the Google Photos app on Android. It can be used on Apple devices through Gemini, but it is not well known yet.
Probably because Google marketing is the worst. Wtf is “nano-banana.” This is straight out of a Silicon Valley HBO skit.
You need to spend a week to disable all AI features in Pixel. You need to spend just a day to disable all AI features in iPhone.
How are they behind, in terms of things normal people actually use? Apps like ChatGPT work just fine, and I have literally never heard an Android user I know personally talk about something they can do which isn’t available on iOS.
That’s not true. There were multiple AI features presented, just very well-known ones like the babel fish AirPods Pro. Nothing fancy, but it’s still AI and it’s useful.
in some ways (hardware) they're ahead, their dominance in mobile chip power lets them do a lot more machine learning on-device
also hard to find a better laptop for running an LLM locally too
Good.
I feel like they don’t want to get burned the way they did with the AVR/metaverse thing. Those headsets are worthless. Probably a lot of ptsd going on with people thinking how Jobs would have had their ass over it. I suspect they’re being very cautious with “AI”.
How did they get burned with AVP?
This is what I like about Apple. They just use technology judiciously without making a big deal about it, and talk up the product instead. As it should be.
You’re absolutely correct!
I have a Samsung A55 but I shudder to think that soon the AI crap will trickle down to midrange devices. Samsung is "all in" on AI.