The real power of Apple Intelligence will show up in third-party apps

Apple Intelligence, the iPhone maker’s new set of AI capabilities arriving in iOS 18, is laying the groundwork for a new way to use apps.

Today, the dated App Store model is under constant regulatory attack. Meanwhile, users can accomplish a lot of tasks with fairly simple questions to an AI assistant like ChatGPT. Proponents believe AI could become the preferred way we’ll search for answers, be productive at work, and experiment with creativity.

Where does that leave the world of apps, and the growing services revenue (more than $24 billion last quarter) they generate for Apple?

The answer cuts to the core of Apple’s AI strategy.

Apple Intelligence itself offers only a small set of capabilities out of the box, like writing helpers, summarization tools, generative art, and other baseline features.

But at its Worldwide Developers Conference (WWDC) in June, Apple presented new features that will allow developers’ apps to connect more deeply with both Siri and Apple Intelligence.

Improvements to the smart assistant will allow Siri to invoke any item from an app’s menu without additional work on a developer’s part. That means users could ask Siri to “show me my presenter notes” in a slide deck, for instance, and Siri would know what to do. Siri will also be able to access any text displayed on the screen, allowing users to reference and act on what they’re viewing.

So, if you were looking at your reminder to wish a family member a “happy birthday,” you could say something like “FaceTime him” and Siri would know what action to take.

That’s already an upgrade from the basic functionality today’s Siri offers, but it doesn’t end there. Apple is also providing developers with tools to use Apple Intelligence in their own apps. At WWDC, the company indicated that Apple Intelligence would first be made available to certain categories of apps, including Books, Browsers, Cameras, Document readers, File management, Journals, Mail, Photos, Presentations, Spreadsheets, Whiteboards, and Word processors. Over time, Apple is likely to open up these capabilities to all developers across the App Store.

The AI functionality will be built on top of the App Intents framework, which is being expanded with new intents for developers. The eventual goal is to allow users to interact with Siri not just to open their apps, but also to use them.
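
To get a concrete sense of the mechanics involved, here is a minimal sketch of how an app exposes an action to Siri through the existing App Intents framework; the intent name, phrase, and icon below are hypothetical and not taken from Apple’s announcement.

```swift
import AppIntents

// Hypothetical intent; the name and behavior are illustrative only.
struct OpenPresenterNotesIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Presenter Notes"

    func perform() async throws -> some IntentResult {
        // A real app would navigate to its presenter-notes view here.
        return .result()
    }
}

// Registers spoken phrases so Siri and Shortcuts can surface the intent.
struct ExampleAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenPresenterNotesIntent(),
            phrases: ["Show my presenter notes in \(.applicationName)"],
            shortTitle: "Presenter Notes",
            systemImageName: "note.text"
        )
    }
}
```

The Apple Intelligence integration builds on this same framework, adding new intents for the app categories listed above, so Siri can map natural, conversational requests onto an app’s actions rather than relying on exact phrase matches.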

That means a user wouldn’t have to dig around in an app’s menus to find the feature they needed to perform a task. They could just ask Siri.

Users could also make these requests while speaking naturally and conversationally, and could reference things related to their personal context.

So, for instance, you could ask a photo-editing app like Darkroom to “apply a cinematic preset to the photo I took of Ian yesterday.” Today’s version of Siri would balk at this sort of request, but the AI-powered Siri would instead know to leverage the app’s Apply Filter intent, as well as which photo you’re asking to use it on.
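
For a request like that to work, the app’s intent would need to expose parameters Siri can fill in from context. A minimal sketch of what that might look like, again using hypothetical names rather than Darkroom’s actual API:

```swift
import AppIntents

// Hypothetical photo-editing intent; names and parameters are illustrative.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"
    static var description = IntentDescription("Applies a named filter to a photo.")

    @Parameter(title: "Filter Name")
    var filterName: String

    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult {
        // A real app would load `photo`, apply the filter named `filterName`,
        // and hand the edited image back to the user here.
        return .result()
    }
}
```

In this sketch, Siri’s job is to resolve the spoken request into those two parameters: the filter name from the words you used, and the photo from your personal context.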

Siri will be able to take action even if you stumble over your words or reference an earlier part of the conversation in your instructions, Apple has said.

You could also take action across apps. For example, after editing your photo, you could ask Siri to move it into another app, like Notes, without having to tap on anything.

In addition, the iPhone’s search feature, Spotlight, will be able to search data from apps by incorporating app entities into its index. App entities are Apple Intelligence’s representation of in-app content like photos, messages, files, calendar events, and more.
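
The building block for that index is the app entity, which developers already define with the App Intents framework. A rough sketch, with hypothetical types and fields, of what describing a photo as an entity might look like:

```swift
import AppIntents

// Hypothetical entity describing a photo in the app's own data model.
struct PhotoEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Photo"
    static var defaultQuery = PhotoEntityQuery()

    var id: UUID
    var caption: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(caption)")
    }
}

// Lets the system resolve entity identifiers back into the app's data.
struct PhotoEntityQuery: EntityQuery {
    func entities(for identifiers: [PhotoEntity.ID]) async throws -> [PhotoEntity] {
        // A real app would look these IDs up in its photo library.
        return []
    }
}
```

Entities like this are what Spotlight would pull into its index, so a search could surface results from inside third-party apps alongside system content.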

This subtler use case for AI, of course, requires developer adoption. Apple has over the years alienated some of its larger developers, and even some of its indies, with its revenue-sharing rules, which generally allow the company to keep 30% of revenue from products and services sold through any app. But developers could be drawn back in as Siri surfaces apps that were previously buried in the App Library and makes them easily accessible through voice commands.

Instead of building onboarding screens to teach users how to navigate and use their app, developers could focus on making sure Siri understands how their app works and how users might ask for the things they want to do in it. That way, users could engage with the app via Siri, either by speaking or typing commands, much as they engage with an AI chatbot like ChatGPT today.

Third-party developers will gain other benefits from Apple’s new AI architecture, too.

With its OpenAI partnership, Siri will be able to hand off queries to ChatGPT when it doesn’t have the answer. With its visual search feature on the iPhone 16 lineup, Apple will also allow users to access OpenAI’s chatbot or Google Search just by tapping on the new Camera Control button on the side, turning what they’re seeing through the camera’s viewfinder into an actionable query.

These developments won’t feel as immediately revolutionary as the introduction of something like ChatGPT did because the rate of developer adoption will likely vary.

Moreover, many of these promises still seem a ways out. In the latest iOS 18 betas, the functionality feels incomplete. As often as I was surprised by what the new Siri can do, I was just as often confused by the things it can’t, even within Apple’s own apps. For instance, you can ask Siri in the Photos app to send a photo you’re viewing to someone, but you can’t ask it to do something more complex, like turn the photo into a sticker. Until Siri stops hitting these kinds of roadblocks, the functionality may end up feeling frustrating to use.


