It was natural to expect Apple to do with AI what it has done with so many of its features and apps: wait, take notes, and redefine. But even as it has smoothed out some of the sharp edges of its controversial technology, the company appears to have hit the same wall as others: Apple Intelligence, like all AI, doesn't actually do anything.
Sure, it can do something. In fact, it can do some things. But like any AI tool, it seems to be a very computationally intensive shortcut for performing ordinary tasks. That isn't necessarily a bad thing, especially as inference (i.e., the actual text analysis, generation, and so on) becomes efficient enough to run on the device itself.
But Tim Cook said at the start of his “Glowtime” event on Monday that Apple Intelligence's “breakthrough capabilities” will have an “incredible impact,” and Craig Federighi said it will “change so much of what you can do with your iPhone.”
Its functions:

- Paraphrase snippets of text
- Summarize emails and messages
- Generate fake emojis and clip art
- Find photos of people, places and events
- Look things up
Do any of these seem groundbreaking to you? There are countless writing helpers. Summarization features come with almost every LLM. Generative art has become synonymous with effortlessness. Photo search across different services has been easy for years. And our “dumb” voice assistants were looking up Wikipedia entries for us a decade ago.
It's certainly an improvement. Doing these things locally and privately is definitely preferable, and it opens up new possibilities for people who can't easily use a regular touchscreen UI. So it's a real gain in convenience.
But literally none of it is new or interesting, and it looks like no meaningful changes have been made to these features since the beta was released after WWDC, outside of the expected bug fixes.
Given its billing as “Apple's first phone built from the ground up for Apple Intelligence,” you would expect the iPhone 16 to ship with these features. As it turns out, it won't launch with all of them; some will arrive in a later update.
Is this a lack of imagination? Or a lack of technology? AI companies are starting to reposition their models as just another enterprise SaaS tool (after all, they were mostly just repeating stuff they found on the web) rather than the “transformative” use cases we hear so much about. AI models can be extremely valuable in the right place, but that place doesn't seem to be in your hands.
There's a strange discrepancy between how commonplace these AI features are and how hyperbolic their descriptions are. Apple has increasingly turned to the kind of breathless advertising it once deployed with restraint, and for genuine innovations. Monday's event was one of the most uninspiring in recent memory, but the language was, if anything, more hyperbolic than usual.
So, like other AI providers, Apple is playing a multi-billion dollar game of making these models seem transformative and groundbreaking, even though almost no one believes they are. How else could anyone justify the sums these companies have spent when the end result is much the same as it was five years ago?
AI models could truly be game-changers in certain areas of scientific research, some coding tasks, perhaps materials and structural design, and perhaps (though not for the better) media.
But if we believe our eyes and thumbs, rather than Cook and Federighi’s reality-distorting keynotes, the things we are told to be excited about don’t seem to do much of anything useful, let alone revolutionary. Ironically, Apple’s announcement failed to deliver an “iPhone moment” for AI.