Apple's new generative AI service, Apple Intelligence, won't just be a consumer-facing feature: developers will also be able to take advantage of the technology. Apple announced during its WWDC 2024 keynote on Monday that developers will be able to integrate Apple Intelligence-powered experiences into their own apps.
Apple's SDK (Software Development Kit) has been updated with a range of new APIs and frameworks, allowing app developers to integrate Image Playground (genAI image creation) with just a few lines of code. Apple showed off how apps like Craft can use this feature to make users' documents more visual by allowing them to add AI images.
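For a sense of what "a few lines of code" looks like, here is a minimal SwiftUI sketch based on the ImagePlayground framework's presentation modifier. The exact parameter names and the concept string are assumptions; treat this as an illustration rather than the definitive API surface.

```swift
import SwiftUI
import ImagePlayground  // iOS 18.1+ SDK; API details hedged above

// A button that opens the system Image Playground sheet seeded with a
// text concept, then displays the generated image in the document.
struct DocumentIllustrationView: View {
    @State private var showingPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        VStack {
            if let url = generatedImageURL {
                AsyncImage(url: url) { image in
                    image.resizable().scaledToFit()
                } placeholder: {
                    ProgressView()
                }
            }
            Button("Add AI Illustration") { showingPlayground = true }
        }
        .imagePlaygroundSheet(
            isPresented: $showingPlayground,
            concept: "a watercolor mountain landscape"
        ) { url in
            // The system hands back a file URL for the generated image.
            generatedImageURL = url
        }
    }
}
```

The key point is that the app never implements any image generation itself; it only presents the system sheet and consumes the result.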
AI-powered writing tools will be automatically available in any app that uses a standard editable text view, and Apple demonstrated how apps like Bear Notes can let users automatically rewrite, proofread and summarize their notes.
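Because the feature rides on standard text views, the developer-facing surface is mostly about opting in or out. A short sketch, assuming the `writingToolsBehavior` trait Apple added to `UITextView` in the iOS 18 SDK (the property name and cases are an assumption here):

```swift
import UIKit

// Writing Tools appear automatically in standard editable text views;
// the only code an app typically needs is a behavior setting.
final class NoteViewController: UIViewController {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.isEditable = true
        // .complete allows the full system experience (rewrite,
        // proofread, summarize); .limited restricts inline rewriting;
        // .none opts this view out entirely.
        textView.writingToolsBehavior = .complete
        view.addSubview(textView)
    }
}
```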
Additionally, Apple is building out more ways for developers to use Siri to take actions within their apps.
According to Apple, developers who have already adopted SiriKit, the SDK for integrating Siri into their apps, will immediately get many of Siri's new capabilities without any additional work, including in areas such as lists, notes, media, messaging, payments, restaurant reservations, VoIP calling and workouts.
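In other words, an app with an existing SiriKit handler in one of those domains inherits the improvements for free. For context, a minimal messaging-domain handler looks like this (the message hand-off is illustrative; the protocol and resolution types are long-standing SiriKit API):

```swift
import Intents

// A minimal SiriKit messaging handler. Apps that already ship code
// like this get Siri's new Apple Intelligence-powered behavior
// without changes, per Apple's keynote.
final class SendMessageHandler: NSObject, INSendMessageIntentHandling {
    func resolveRecipients(
        for intent: INSendMessageIntent,
        with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void
    ) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion([.needsValue()])  // ask Siri to prompt for a recipient
            return
        }
        completion(recipients.map { .success(with: $0) })
    }

    func handle(
        intent: INSendMessageIntent,
        completion: @escaping (INSendMessageIntentResponse) -> Void
    ) {
        // Hand the message off to the app's own sending pipeline here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```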
During its developer keynote, Apple said there are two new Siri features that developers can take advantage of without any additional work: First, Siri can invoke any item from an app's menu, meaning users can say things like “Show me the presenter's notes” on a slide deck, as well as more conversational things like “I need to see the speaker's notes.”
Second, Siri will now have access to any text that appears on screen and uses Apple's standard text system, allowing it to see and act on that text. For example, if you have a reminder or note to “Celebrate Grandpa's Birthday,” you can simply say “FaceTime him” to take action based on that note.
Meanwhile, Apple's App Intents framework, which lets apps expose their actions to Siri and Shortcuts, will also gain access to Apple Intelligence.
Apple is defining new intents and making them available to developers, starting with categories such as books, browser, camera, document reader, file management, journal, mail, photos, presentations, spreadsheets, whiteboards, and word processors.
Apple says that because these intents come pre-defined and tested, they are easier for developers to adopt.
For example, a photo editing app like Darkroom can adopt the Apply Filter intent, so a user can simply say, “Apply a cinematic filter to Ian's photo from yesterday,” and the app will perform the action. More domains will be added in the future.
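A hedged sketch of what such a domain intent might look like expressed as a plain App Intent. Apple ships these intents pre-defined, so the real adoption path is conforming to Apple's schema rather than writing this from scratch; `PhotoEntity`, `PhotoQuery`, and the filter hand-off are illustrative stand-ins for an app's own types.

```swift
import AppIntents

// Illustrative stand-in for a photo the app exposes to the system.
struct PhotoEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Photo"
    static var defaultQuery = PhotoQuery()

    var id: String
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(id)")
    }
}

// Lets the system look up photos by identifier when resolving a request.
struct PhotoQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [PhotoEntity] {
        identifiers.map(PhotoEntity.init(id:))
    }
}

// The intent itself: Siri fills in both parameters from a spoken
// request like "Apply a cinematic filter to Ian's photo from yesterday."
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"

    @Parameter(title: "Filter")
    var filterName: String

    @Parameter(title: "Photo")
    var photo: PhotoEntity

    func perform() async throws -> some IntentResult {
        // Apply the named filter through the app's own editing
        // pipeline (omitted in this sketch).
        return .result()
    }
}
```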
Initially, users will be able to invoke these intents through the Shortcuts app, but over time, Siri will gain the ability to invoke an app's intents in supported domains.
Additionally, Apple said during the keynote that apps that fit into the existing SiriKit domains will be able to benefit from Siri's enhanced conversational capabilities, such as responding correctly when users stumble over their words and understanding references to earlier parts of a conversation.
Siri can also search data from apps using the new Spotlight API, which allows apps to include their entities in the index. Those entities then join Apple Intelligence's semantic index of photos, messages, files, calendar events, and more.
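In practice, this builds on Core Spotlight's existing donation flow. A sketch, assuming an app indexing a note: the indexing calls are long-standing Core Spotlight API, while attaching an App Intents entity to the item is the new iOS 18 piece and is shown only as a hedged comment because its exact signature is an assumption.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Donates a note to the Spotlight index so Siri can find and act on it.
func indexNote(id: String, title: String, body: String) {
    let attributes = CSSearchableItemAttributeSet(contentType: .text)
    attributes.title = title
    attributes.contentDescription = body

    let item = CSSearchableItem(
        uniqueIdentifier: id,
        domainIdentifier: "notes",
        attributeSet: attributes
    )
    // New in iOS 18 (signature hedged): associate the item with the
    // app's App Intents entity so it surfaces in the semantic index.
    // item.associateAppEntity(NoteEntity(id: id), priority: 0)

    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error {
            print("Indexing failed: \(error)")
        }
    }
}
```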
The company on Monday also announced its own password management app, Genmoji (AI-generated emoji) and a calculator for the iPad.
This post was updated after publication to reflect more information from the Developer Keynote.