While last year's WWDC keynote highlighted Apple's ambitious advances in AI, this year the company played down Apple Intelligence, focusing instead on updates to its operating systems, services, and software, and introducing a new aesthetic called “Liquid Glass” along with a new naming convention.
Still, Apple tried to appease the crowd with several AI-related announcements, including an image analysis tool, a workout coach, and live translation features.
Visual Intelligence
Visual Intelligence is Apple's AI-powered image analysis technology that lets you gather information about your surroundings. For example, it can identify a plant in your garden, tell you about a restaurant, or recognize a jacket someone is wearing.
Image credit: Apple
The feature can now also interact with what's on your iPhone's screen. For example, when you come across a post in a social media app, Visual Intelligence can run an image search on what you're looking at, using Google Search, ChatGPT, and similar apps.
To access Visual Intelligence, open Control Center or customize the Action button (you can also press the same buttons normally used to take a screenshot). The feature will be available when iOS 26 is released later this year.
ChatGPT comes to Image Playground
Apple has integrated ChatGPT into Image Playground, its AI-powered image generation tool. With ChatGPT, the app can now generate images in new styles, such as “anime,” “oil painting,” and “watercolor.” You can also prompt ChatGPT to create additional images.
Workout Buddy
Apple's new AI-powered workout coach is exactly what it sounds like: it uses a text-to-speech model to offer encouragement while you exercise, mimicking the voice of a personal trainer. When you start a run, the AI in the Workout app delivers a motivational pep talk, highlighting key moments such as when you ran your fastest mile and your average heart rate. After you finish a workout, the AI summarizes your average pace, heart rate, and whether you hit any milestones.
Live Translation
Apple Intelligence now powers live translation in Messages, FaceTime, and the Phone app. The technology automatically translates text or spoken words into the user's preferred language in real time. On FaceTime calls, users see live translated captions, while on phone calls Apple translates the conversation aloud.
Image credit: Apple
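Apple hasn't detailed how the feature works under the hood, but its existing Translation framework already exposes similar on-device translation to developers. Here is a minimal sketch using that publicly documented API; it is not the new Messages/FaceTime feature itself, and the sample text and language pair are illustrative assumptions.

```swift
import SwiftUI
import Translation

// A minimal sketch using Apple's existing Translation framework (iOS 18+),
// not the new live translation feature itself. The sample text and the
// French-to-English language pair are illustrative assumptions.
struct TranslationDemo: View {
    @State private var configuration: TranslationSession.Configuration?
    @State private var translated = "…"

    var body: some View {
        Text(translated)
            // Runs whenever `configuration` changes, handing us a session
            // backed by Apple's on-device translation models.
            .translationTask(configuration) { session in
                if let response = try? await session.translate("Bonjour, comment ça va ?") {
                    translated = response.targetText
                }
            }
            .onAppear {
                configuration = TranslationSession.Configuration(
                    source: Locale.Language(identifier: "fr"),
                    target: Locale.Language(identifier: "en"))
            }
    }
}
```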
AI help with unknown callers
Apple introduced two new AI-powered features for the Phone app. The first, Call Screening, automatically answers calls from unknown numbers in the background, asking for the caller's name and the reason for the call so users can decide whether to pick up.
The second, Hold Assist, automatically detects hold music when you're waiting for a call center agent. You can choose to stay on hold while using your iPhone for other tasks, and a notification alerts you when a live agent becomes available.
Poll suggestions in Messages
Apple also introduced a new feature that lets users create polls within the Messages app. The feature uses Apple Intelligence to suggest a poll based on the context of the conversation. For example, if people in a group chat are struggling to decide where to eat, Apple Intelligence recommends starting a poll to settle the decision.
Image credit: Apple
AI-powered Shortcuts
The Shortcuts app is becoming more useful thanks to Apple Intelligence. The company explained that when building a shortcut, users can now select an AI model to enable capabilities such as AI summarization.
Image credit: Apple
Context-aware Spotlight
Spotlight, the Mac's on-device search feature, received a minor update. With Apple Intelligence built in, it becomes more context-aware, suggesting actions users typically perform, tailored to their current task.
Foundation models for developers
Apple is giving developers access to its on-device AI models, even offline, through the new Foundation Models framework. The framework lets developers build additional AI capabilities into third-party apps on top of Apple's existing systems, a move that could encourage more developers to ship new AI features as Apple competes with other AI companies.
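For a sense of what that looks like in practice, here is a minimal sketch of calling the framework from a third-party app, based on the LanguageModelSession API Apple has previewed; the exact names and availability annotations are assumptions until the final SDK ships.

```swift
import FoundationModels

// A minimal sketch, assuming the LanguageModelSession API Apple previewed
// at WWDC; exact signatures may differ in the shipping SDK.
@available(iOS 26.0, macOS 26.0, *)
func summarize(_ text: String) async throws -> String {
    // Instructions steer the on-device model for the whole session.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence.")

    // Because the model runs on device, this works without a network connection.
    let response = try await session.respond(to: text)
    return response.content
}
```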
Apple's AI-powered Siri is delayed
The most disappointing news out of the event was that the long-awaited smarter Siri still isn't ready. Attendees were eager for a glimpse of the promised AI-powered capabilities that had been expected to debut. However, Craig Federighi, Apple's SVP of software engineering, said the company won't share more until next year. The delay could raise questions about Apple's strategy for voice assistants in an increasingly competitive market.