The App Store’s new AI-generated tags are live in the beta

🗓️ 2025-06-14 16:35


Apple’s plans to improve App Store discoverability using AI tagging techniques are now available in the developer beta build of iOS 26.

However, the tags do not yet appear on the public App Store, nor do they inform the search algorithm on the public store.

Of course, with any upcoming App Store upgrade, there’s speculation about how changes will impact an app’s search ranking.

A new analysis by app intelligence provider Appfigures, for example, suggests metadata extracted from an app’s screenshots is influencing its ranking.

The firm theorized that Apple was extracting text from screenshot captions. Previously, only the app’s name, subtitle, and keyword list would count towards its search ranking, it said.

The conclusion that screenshots are informing app discoverability is accurate, based on what Apple revealed at its Worldwide Developers Conference (WWDC 2025), but Apple is extracting that data with AI, not the OCR techniques Appfigures had guessed.

At its annual developer conference, Apple explained that screenshots and other metadata would be used to help improve an app’s discoverability. The company said it’s using AI techniques to extract information that would otherwise be buried in an app’s description, its category information, its screenshots, or other metadata, for example. That also means that developers shouldn’t need to add keywords to the screenshots or take other steps to influence the tags.

This allows Apple to assign tags that better categorize the app. Ultimately, developers will be able to control which of these AI-assigned tags are associated with their apps, the company said.
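Apple has not published how its tagging system works, so as a toy illustration only, the sketch below shows the general idea of deriving category tags from app metadata (name, subtitle, description, screenshot captions). It uses simple keyword matching with a made-up tag vocabulary; Apple's actual system uses unspecified AI techniques, not this approach.

```python
# Toy sketch only: the tag vocabulary and matching logic below are
# invented for illustration and do not reflect Apple's implementation.

# Hypothetical tag vocabulary mapping each tag to trigger keywords.
TAG_KEYWORDS = {
    "photo-editing": {"edit", "filter", "retouch", "photos"},
    "journaling": {"journal", "diary", "mood"},
    "fitness": {"workout", "run", "calorie"},
}

def suggest_tags(metadata_fields, min_hits=1):
    """Return tags whose keywords appear in the combined metadata text."""
    text = " ".join(metadata_fields).lower()
    words = set(text.replace(",", " ").replace(".", " ").split())
    scored = []
    for tag, keywords in TAG_KEYWORDS.items():
        hits = len(keywords & words)
        if hits >= min_hits:
            scored.append((tag, hits))
    # Rank by number of matching keywords, strongest first.
    return [tag for tag, _ in sorted(scored, key=lambda t: -t[1])]

print(suggest_tags([
    "SnapFix",                       # app name
    "Edit and retouch your photos",  # subtitle
    "Apply a filter in one tap",     # screenshot caption
]))
# Prints: ['photo-editing']
```

The point of the exercise is that signals buried in screenshots and descriptions can surface tags a developer never entered in the keyword field, which is what makes the change relevant to search ranking.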

Plus, Apple assured developers that humans would review the tags before they went live.

When the tags roll out to the global App Store, it will be important for developers to understand them and learn which ones will help their app get discovered.


Consumer News Editor

Sarah has worked as a reporter for TechCrunch since August 2011. She joined the company after having previously spent over three years at ReadWriteWeb. Prior to her work as a reporter, Sarah worked in I.T. across a number of industries, including banking, retail and software.



© 2025 TechCrunch Media LLC.
