After much hype, Apple Intelligence, the company's suite of AI features, finally shipped to users this week with the iOS 18.1 update. I've been using these features for several months through beta software, and the feature set rolling out this week goes beyond helping users plan and brainstorm ideas with ChatGPT or search the web with Perplexity; what I noticed is that it focuses on improving the everyday user experience.
One of my favorite features in this rollout is the ability to type to Siri and have a conversation. This makes it easy to set a timer or request a currency conversion without having Siri respond with "I don't understand." To enable it, turn on the toggle under Settings > Apple Intelligence & Siri > Talk & Type to Siri > Type to Siri, then double-tap the bottom bar to summon Siri at any time.
That's not the only convenience. The new Siri is better at understanding you when you stumble over your words or change your timer from 10 to 15 minutes mid-request. However, for many questions, such as "What can I use as a substitute for pine nuts?", Siri will still pull information from the web or redirect you to a website.
One of the features I've used the most in recent weeks is Apple's Clean Up tool in Photos. It lets you remove certain objects from a photo, such as a passerby in your selfie, through automatic or manual selection.
The removal isn't perfect. If you look closely at the edited area, you can see smudging, and in some cases these artifacts are quite noticeable. But with some cropping and clever use of filters, the photos are at least good enough to post on social media. I demoed the feature to a few friends, and they started regularly sending me photos to clean up. Notably, Google has its own photo cleanup feature, Magic Eraser, available on Pixel phones and in Google Photos.
Image credit: Ivan Mehta / TechCrunch
Notification summaries are a trickier feature. You can choose to summarize notifications from all apps, from none, or from selected apps. The summaries are mostly accurate, but they can sometimes sound strange. For example, a friend mentioned in one of several messages that she didn't know why her feet hurt. The resulting summary looked like this:
Image credit: Ivan Mehta / TechCrunch
To be fair, the summary feature did help me clear informational messages, and ones that didn't require my attention at the time, out of my notification tray.
I'm not the only one who finds notification summaries weird at times. Plenty of people have posted about them on social media, including one person who said he found out he was being dumped through one of these summaries.
The Apple Intelligence summary is a little crazy.
A friend sent an image of a car in a group chat. The screenshot on the right shows how the model summarized it in the notification panel. pic.twitter.com/MJ3a4Ns1Oa
— Deep (@mehtadeep) October 21, 2024
It's hard to know what the right mix of useful and amusing is, but Apple's Craig Federighi told the Wall Street Journal that summaries may not always be interesting to users. Additionally, the system doesn't automatically summarize notifications in potentially sensitive contexts.
Apple's Writing Tools feature is something I haven't used much on my iPhone. Part of that is because I do most of my writing on my desktop, but it's also because I didn't feel the need to make my emails more "professional" with AI. I occasionally used the proofreading function, which was sufficient for basic checks. In the early stages of testing, I noticed that the tool stumbled when dealing with foul language and topics such as drugs and murder, so it may not be useful if you're writing a thriller plot.
Apple's initial feature set doesn't generate text or images for users, unlike tools such as ChatGPT, Gemini, and Claude. The second set of AI features, available in the developer beta, promises more of that magic, with ChatGPT integration for answers and text generation. For now, Apple Intelligence might make your life a little better, but it might not be enough reason to buy a new iPhone that's "made for Apple Intelligence."