Artificial intelligence, despite its impressive capabilities, is starting to get a bad rap among consumers. AI chatbots are prone to hallucinations, making up answers when they don't know how to respond and confidently presenting false information as fact. Google's AI-powered search overhaul went so badly that the company had to clarify it never intended to encourage users to put glue on pizza or eat rocks, and it scaled the feature back for some queries after it made too many mistakes. Microsoft's AI-powered Recall feature, which captures snapshots of your screen, will now be turned off by default after security researchers found worrying flaws.
In this environment, launching an AI-powered iPhone could be seen as a risk.
But with iOS 18, which Apple showed off at WWDC 2024, the company is taking a more measured approach. Rather than overwhelming users with countless AI features, the Cupertino tech giant is deploying AI carefully, in places where it thinks the technology will actually be useful, and keeping it out of places where it could undermine the carefully crafted experience of using an Apple device.
Not only is Apple rebranding AI to “Apple Intelligence” for its own purposes, but it's also integrating iOS 18's new AI features in more practical ways.
Beyond quirky additions like AI emojis, Apple Intelligence will also be coming to everyday apps and features, adding things like assistive writing and proofreading tools, AI summarization and transcription, notification prioritization, smart replies, better search, photo editing features, and a Do Not Disturb-style mode that can recognize important messages and still notify you about them.
Image credit: Apple
Combined, these features may not be as exciting as a chatbot like ChatGPT that can answer almost any question and put a world of knowledge gleaned from the internet at your fingertips. Nor are they as mind-blowing as tools that can generate AI images in the style of any artist (a capability that is controversial in its own right).
Instead, Apple has defined a baseline of what an AI-powered device should be able to do.
Right now, it should help you pull out what's important from long bodies of text, such as notes, emails, documents, and piles of notifications. It should make it easy to search for things using natural language queries, such as what's in a photo. It should transcribe audio, spot grammar and spelling mistakes, rewrite text in different styles, and suggest common responses. It should handle basic photo editing, such as removing unwanted objects or people from photos. And it should be able to produce images on request, but with strict guardrails in place.
Image credit: Apple
Viewed this way, some of Apple Intelligence's new features don't feel like AI, but simply like smarter tools.
This is a deliberate move on Apple's part. The company says it focused on use cases that target specific, well-defined problems, which are far easier to solve than the open-ended complexities that come with AI chatbots. That narrow focus gives Apple more confidence that it can deliver the results users expect, not hallucinations, and it limits the dangers and safety concerns that can arise from misusing or prompt-engineering the AI.
Moreover, Apple's AI walks a careful line between assisting end users and acting as an independent creative force, the latter of which doesn't always sit well with creators, a large part of Apple's customer base. If you want to make a sentence more concise or summarize an email, Apple Intelligence can help. If you want to fire off a quick reply to an email, the suggested responses can help. But if you want to write an entire bedtime story from scratch, Apple lets you ask ChatGPT to do that instead.
Image credit: Apple
The company is following a similar path when it comes to image creation. You can use Apple Intelligence to create an image while texting a friend, but the feature relies on understanding who you're talking to and what the topic is, so you probably won't be prompted to create an AI image if you're texting about something explicit or inappropriate. The same goes for adding an image in other apps like Keynote, Pages, and Freeform. The new standalone AI image generation app, Image Playground, also guides you through suggestions and limits you to the style you choose, meaning you can't create photorealistic deepfakes with Apple's apps.
If you ask Siri a question it can't answer, Siri will offer to hand you over to ChatGPT (with your consent), letting you explore the wider world of chatbots and the different answers they can provide, if you so wish. And when ChatGPT inevitably gets something wrong, that failure will be ChatGPT's, not Apple's.
In fact, much of what Apple offers isn't a way to "chat" with an AI at all. It's a way to apply AI to limited use cases, like translating text with the click of a button, or letting the AI intuit what you need to see: an urgent text notification from your mom, not a coupon from DoorDash. The AI mostly sits in the background or off to the side as a tool; it isn't the main interface for getting work done.
Image credit: Apple
And that's where Apple Intelligence succeeds: it feels like an extra layer of functionality added to existing apps, solving everyday problems (or maybe just letting you play with emojis). Apple Intelligence isn't trying to take over the world, even as experts and departing OpenAI executives keep warning us that AI one day might. Apart from silly features like Genmoji, Apple Intelligence is boring and practical. Maybe that's why it actually works.
Apple Intelligence is expected to be released as a beta this fall.