At its 2024 Apple Event on Monday, Apple announced that visual search is coming to the iPhone, powered by Apple Intelligence, the company's suite of AI features.
The new Camera Control button on the iPhone 16 and 16 Plus can activate a feature Apple calls “visual intelligence,” essentially a combination of reverse image search and text recognition.
For example, when you use visual intelligence to search for a restaurant, Apple says it will show you options to view the restaurant's hours, ratings, and menu, as well as make a reservation. And if you come across an event flyer, you can use visual intelligence to quickly add the title, time, date, and location to your calendar.
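Apple hasn't published how visual intelligence turns a flyer photo into a calendar entry, but the flow it describes — recognize the text, then map lines to event fields — can be sketched with simple heuristics. This is a hypothetical illustration, not Apple's implementation; the function name, patterns, and sample flyer are all invented for the example.

```python
import re
from datetime import datetime

def parse_flyer(ocr_lines):
    """Map recognized text lines from an event flyer to calendar fields.

    Purely illustrative: a real system would use learned models rather
    than regex heuristics like these.
    """
    event = {"title": None, "date": None, "time": None, "location": None}
    date_pat = re.compile(r"\b(\w+ \d{1,2}, \d{4})\b")           # e.g. "June 14, 2025"
    time_pat = re.compile(r"\b(\d{1,2}:\d{2}\s?[AP]M)\b", re.I)  # e.g. "7:30 PM"
    for line in ocr_lines:
        if m := date_pat.search(line):
            event["date"] = datetime.strptime(m.group(1), "%B %d, %Y").date()
        elif m := time_pat.search(line):
            event["time"] = m.group(1)
        elif line.lower().startswith(("at ", "venue:")):
            # Strip a leading "Venue:" label or "at " preposition.
            event["location"] = line.split(":", 1)[-1].removeprefix("at ").strip()
        elif event["title"] is None:
            event["title"] = line.strip()  # treat the first remaining line as the title
    return event

# Sample OCR output from a (fictional) flyer:
flyer = ["Summer Jazz Night", "June 14, 2025", "7:30 PM", "Venue: Riverside Park"]
print(parse_flyer(flyer))
```

On an iPhone, the recognized fields would then feed an EventKit calendar entry; the sketch stops at the structured dictionary.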
The company says the feature respects user privacy and that Apple's services will never store user images.
Apple's partnership with OpenAI will also let you use the new Camera Control button on iPhone 16 models to send queries to ChatGPT instead. As an example, Apple suggests the feature could be useful when you get stuck on a homework assignment.
Visual intelligence, along with other Apple Intelligence features, will launch in beta for US English users in October, with availability expanding to other countries in December and early 2025.