Apple released new Apple Intelligence features with iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2, including Image Playground, Genmoji, writing tools enhancements, and ChatGPT integration.
The company also expanded language support for Apple Intelligence to localized English in Australia, Canada, Ireland, New Zealand, South Africa, and the U.K. Previously, it supported only U.S. English for its suite of AI features.
ChatGPT integration is one of the most anticipated AI features: Siri can now offer to pass questions it can't answer itself along to ChatGPT. And while Apple Intelligence's own writing tools won't write text from scratch for you, there is now a "Compose" option that calls on ChatGPT to generate text from your prompt.
Apple had already shipped writing tools with iOS 18.1, but they were limited to rewriting a block of text in a friendly, professional, or concise style, or to tapping the rewrite button without any input. With iOS 18.2, you can describe to Apple Intelligence the changes you want, allowing for more customized edits.
You remain in control of the ChatGPT integration: by default, you don't need to sign in to a ChatGPT account to use the feature, and OpenAI won't store your requests. Logging in to your account gives you access to higher-quality models.
The new Image Playground feature lets you create images in different styles through a dedicated app. You can also use a photo of yourself or a friend to make an AI-powered remix by describing different scenarios and settings, and you can pick an image style from Animation or Illustration. Apple is baking Image Playground into apps like Messages and will provide an API for third-party developers, too.
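For developers, here is roughly what that integration could look like; a minimal sketch, assuming the ImagePlayground framework's view controller API (the class, delegate, and property names reflect my reading of the SDK and should be treated as assumptions):

```swift
import UIKit
import ImagePlayground

@available(iOS 18.2, *)
final class AvatarViewController: UIViewController, ImagePlaygroundViewController.Delegate {
    // Present the system Image Playground sheet with a text concept.
    // ImagePlaygroundViewController, ImagePlaygroundConcept, and the delegate
    // methods below are assumptions based on the publicly announced framework.
    func makePlaygroundImage() {
        guard ImagePlaygroundViewController.isAvailable else { return } // requires Apple Intelligence
        let playground = ImagePlaygroundViewController()
        playground.concepts = [.text("astronaut riding a bicycle, illustration style")]
        playground.delegate = self
        present(playground, animated: true)
    }

    // Called with a file URL for the generated image.
    func imagePlaygroundViewController(_ controller: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        controller.dismiss(animated: true)
        // Load the image from imageURL and use it, e.g. attach it to a message.
    }

    func imagePlaygroundViewControllerDidCancel(_ controller: ImagePlaygroundViewController) {
        controller.dismiss(animated: true)
    }
}
```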
Similar to Image Playground, Genmoji lets you create a new emoji by describing one that doesn't exist while the emoji keyboard is open. You will see your description highlighted when creating a new emoji is an option. You can also use images from your photo library to create custom Genmoji.
Apple is also debuting Visual Intelligence on the iPhone 16, which lets you point the camera at an object by long-pressing the Camera Control button and then run a Google search or ask ChatGPT about what's in the frame (remember the person pointing at the dog in the demo?). You can also point it at text to save a phone number (something already possible through Live Text) or translate a menu.
What's more, the Notes app is getting the Image Wand tool, which turns rough sketches into polished images. You can circle a sketch in a note, and the tool will automatically convert it into an image.
Apple's next big push for AI is giving developers access to its Apple Intelligence-related APIs to integrate into their apps. The company said that in the coming months, Siri will understand users better, provide contextual suggestions, and gain onscreen awareness.
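Some of that developer access is already taking shape: standard text controls pick up the system Writing Tools automatically, and apps can tune how far they go. A minimal sketch, assuming UIKit's writingToolsBehavior property as shipped in the iOS 18 SDK (the exact property and case names are my assumption):

```swift
import UIKit

// Standard text controls get Writing Tools automatically on devices with
// Apple Intelligence; apps can tune or opt out of the behavior.
// The property and case names below are assumptions based on the iOS 18 SDK.
func configureEditor(_ textView: UITextView) {
    if #available(iOS 18.0, *) {
        // .complete allows full in-place rewrites; .limited keeps results in an
        // overlay, and .none opts this view out of Writing Tools entirely.
        textView.writingToolsBehavior = .complete
    }
}
```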