Apple's plan to improve App Store discovery using AI-powered tagging techniques is now live in the developer beta of iOS 26.
However, the tags are not yet visible on the public App Store, nor do they yet inform the public App Store's search algorithm.
Image credit: App Store screenshots (Developer Beta 1, iOS 26)
Of course, there is speculation about how the changes will affect your app's search rankings with future App Store updates.
For example, a new analysis by app intelligence provider Appfigures suggests that metadata extracted from app screenshots is affecting rankings.
The company theorized that Apple was extracting text from screenshot captions. Previously, only app names, subtitles, and keyword lists were thought to count toward search rankings.
The conclusion that screenshots inform app discoverability is accurate, based on what Apple announced at its Worldwide Developers Conference (WWDC 25). However, contrary to Appfigures' speculation, Apple extracts that information using AI techniques rather than OCR.
At its annual developer conference, Apple explained that it can use screenshots and other metadata to help improve an app's discoverability. The company said it uses AI techniques to extract information that would otherwise be buried in app descriptions, category information, screenshots, or other metadata. This means developers don't need to add keywords to screenshots or take other steps to influence the tags.
This allows Apple to assign tags that better categorize apps. Ultimately, developers will have control over which of these AI-assigned tags are associated with their apps, the company said.
Additionally, Apple has assured developers that humans will review tags before they go live.
Once tags roll out to App Store users globally, it will be important for developers to understand them better.