The iPhone maker's iOS 18 will bring a new set of AI features called “Apple Intelligence” that lays the foundation for new ways to use apps.
Currently, the outdated App Store model is under constant regulatory attack. Meanwhile, users can accomplish many tasks simply by asking AI assistants like ChatGPT a question. Proponents believe that AI could become the preferred way to find answers, be more productive at work, and even exercise our creativity.
So what does that mean for the world of apps and the growing services revenue they bring in for Apple (more than $6 billion last quarter)?
The answer goes to the heart of Apple's AI strategy.
Apple Intelligence itself offers only a small selection of features out of the box, such as writing assistance, summarization, and generative image creation.
But at its Worldwide Developers Conference (WWDC) in June, Apple announced new features that will enable developers' apps to more deeply integrate with both Siri and Apple Intelligence.
The upgraded assistant will allow Siri to call up any item from an app's menus without additional work on the developer's part. This means, for example, that when a user asks Siri to “show me the presenter notes” on a slide deck, Siri will know what to do. Siri will also have access to any text displayed on the page, allowing users to act on what's on the screen.
So if you're looking at a reminder to say “Happy Birthday” to a family member, you could say something like, “FaceTime them,” and Siri will know what action to take.
Image credit: Apple
This is an upgrade from the basic features that Siri offers today, but it's not the only one. Apple is also giving developers the tools to use Apple Intelligence in their own apps. At WWDC, the company indicated that Apple Intelligence will initially be available in certain categories of apps, such as Books, Browser, Camera, Document Reader, File Management, Journal, Mail, Photos, Presentations, Spreadsheets, Whiteboard, and Word Processors. Over time, Apple may open up these features to all developers across the App Store.
The AI capabilities are built on the App Intents framework, which is being extended with new intents for developers. The ultimate goal is to let users act on an app's features through Siri, rather than just using Siri to open the app.
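To make that concrete, here is a minimal sketch of what an App Intent looks like in Swift. The slide-deck scenario and the `PresenterNotesScreen` helper are made up for illustration; the intents Apple exposes to Apple Intelligence build on this same basic pattern.

```swift
import AppIntents

// Hypothetical stand-in for the app's own navigation code.
enum PresenterNotesScreen {
    @MainActor static func show() { /* navigate to the notes view */ }
}

// A minimal App Intent that Siri (or Shortcuts) can invoke by name.
struct ShowPresenterNotesIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Presenter Notes"

    // Open the app in the foreground before the intent runs.
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        PresenterNotesScreen.show()
        return .result()
    }
}
```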
This means that users no longer have to hunt through app menus to find the feature they need to accomplish a task; they can simply ask Siri.
Users can make these requests in natural, conversational language and can also refer to things from their personal context.
For example, you could ask a photo editing app like Darkroom to “add a cinematic effect to the photo I took of Ian yesterday.” Today's Siri can't handle this type of request, but an AI-powered Siri will understand both the app intent to apply a filter and which photo the filter should be applied to.
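Under the hood, that kind of request maps onto an intent with typed parameters. Here's a rough sketch of how a photo app might expose one; the `PhotoEntity`, `PhotoQuery`, and in-memory `PhotoStore` names are invented for this example and are not Darkroom's actual code.

```swift
import AppIntents

// Hypothetical in-memory photo store for this sketch.
enum PhotoStore {
    static var photos: [PhotoEntity] = []
}

// Entity the app exposes so Siri can refer to a specific photo.
struct PhotoEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Photo")
    static var defaultQuery = PhotoQuery()

    var id: UUID
    var caption: String   // e.g. "Ian, yesterday"

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(caption)")
    }
}

// Query the system uses to resolve references like
// "the photo I took of Ian yesterday" to a concrete entity.
struct PhotoQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [PhotoEntity] {
        PhotoStore.photos.filter { identifiers.contains($0.id) }
    }
}

// Intent that applies a named filter to a specific photo.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"

    @Parameter(title: "Photo")
    var photo: PhotoEntity

    @Parameter(title: "Filter")
    var filterName: String

    func perform() async throws -> some IntentResult {
        // Hypothetical filter pipeline would run here.
        return .result()
    }
}
```

In this pattern, the request is resolved to a concrete `PhotoEntity` through the query before the intent runs, which is what lets a vague, personal reference end up pointing at one specific photo.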
Apple says Siri will be able to respond even if you stumble over your words or refer back to earlier parts of the conversation in your command.
You can also perform actions across apps: for example, you can edit a photo and then ask Siri to move it to another app, like Notes, without tapping anything.
Image credit: Apple
Additionally, Spotlight, the iPhone's search feature, will be able to search app data by incorporating app entities into its index, giving Apple Intelligence an understanding of photos, messages, files, calendar events, and more.
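Apple hasn't spelled out all of the plumbing publicly, but apps can already donate content to Spotlight's index through the long-standing Core Spotlight APIs, and the new entity indexing is meant to feed the same index. The sketch below uses that existing path with a made-up photo record, purely as an illustration.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// A sketch of donating a piece of app content to Spotlight via Core Spotlight.
// The id and caption stand in for hypothetical app data.
func indexPhoto(id: String, caption: String) {
    let attributes = CSSearchableItemAttributeSet(contentType: .image)
    attributes.title = caption
    attributes.contentDescription = "Photo edited in the app"

    let item = CSSearchableItem(
        uniqueIdentifier: id,
        domainIdentifier: "photos",
        attributeSet: attributes
    )

    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error {
            print("Spotlight indexing failed: \(error)")
        }
    }
}
```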
Of course, these more nuanced uses of AI require developer adoption. Apple has long alienated some big developers, and even some indie ones, with revenue-sharing rules that let the company keep up to 30% of revenue from products and services sold through apps. But Siri could re-engage developers by making apps that were previously buried in the App Library at the back of the phone easily accessible via voice command.
Instead of teaching users how to navigate an app through tedious onboarding screens, developers can focus on making sure Siri understands how the app works and how users might ask for what they want. Users could then interact with the app through Siri, by voice or typed command, much as we interact with AI chatbots like ChatGPT today.
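In practice, part of that work is declaring the phrases users might say. The existing App Shortcuts API already works this way; the sketch below registers a couple of example phrases for a hypothetical filter intent (the `ApplyFilterIntent` name and the phrases are invented for illustration).

```swift
import AppIntents

// Minimal stand-in for a filter intent like the one sketched earlier.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"
    func perform() async throws -> some IntentResult { return .result() }
}

// Registers spoken phrases so Siri knows how users might ask for the feature.
// The system substitutes the app's name for \(.applicationName).
struct PhotoAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ApplyFilterIntent(),
            phrases: [
                "Apply a filter in \(.applicationName)",
                "Add an effect to my photo in \(.applicationName)"
            ],
            shortTitle: "Apply Filter",
            systemImageName: "camera.filters"
        )
    }
}
```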
Third-party developers will also see other benefits from Apple's new AI architecture.
Screenshot images courtesy of Apple
The partnership with OpenAI will allow Siri to pass queries to ChatGPT if it can't find an answer. With visual search on the iPhone 16 series, Apple will allow users to access OpenAI's chatbot or Google search with just a tap of the new Camera Control button on the side, helping to translate what they see through the camera viewfinder into actionable queries.
Developer adoption rates will likely vary, so these developments won’t feel as immediately revolutionary as the introduction of something like ChatGPT.
Moreover, this promise of the future still seems a long way off. In the latest iOS 18 beta, the features feel incomplete. I was often impressed by what the new Siri could do, but just as often left confused by what it couldn't. This is true even within Apple's own apps. For example, in the Photos app, I can ask Siri to send the photo I'm viewing to someone, but not to do something more complicated, like turn a photo into a sticker. Until Siri stops running into these roadblocks, the features may end up feeling a bit clunky to use.