As artificial intelligence (AI) begins to transform user experiences across devices and apps, ‘Apple Intelligence’ aims to put powerful generative models at the core of iPhone, iPad, and Mac, arriving in beta this fall as part of iOS 18, iPadOS 18, and macOS Sequoia.
Apple claims ‘Apple Intelligence’ sets a new standard for privacy in AI and understands personal context to deliver intelligence that is helpful and relevant. ChatGPT will also come to iOS 18, iPadOS 18, and macOS Sequoia later this year, powered by GPT-4o.
Here is how the new AI is set to change the way people use their devices to communicate, work, and express themselves.
To begin with, the generative models draw on personal context to deliver intelligence that is useful and relevant to the individual user.
According to the company, Private Cloud Compute sets a new standard for privacy in AI by flexing and scaling computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.
AI-powered Writing Tools will help you rewrite, proofread, and summarise text.
With the text view delegate API, you can customize how your app behaves while Writing Tools is active, for example by pausing syncing to avoid conflicts while Apple Intelligence is processing text.
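For developers, a minimal sketch of that hook-up in Swift, assuming the UITextViewDelegate callbacks Apple introduced for Writing Tools in iOS 18; the syncing helpers are hypothetical stand-ins for an app's own logic:

```swift
import UIKit

// A sketch of the Writing Tools delegate hooks, assuming the iOS 18
// UITextViewDelegate callbacks; pauseSyncing()/resumeSyncing() are
// hypothetical stand-ins for an app's own sync logic.
class NoteEditorViewController: UIViewController, UITextViewDelegate {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.delegate = self
        view.addSubview(textView)
    }

    // Called when the user invokes Writing Tools on this text view.
    func textViewWritingToolsWillBegin(_ textView: UITextView) {
        pauseSyncing()   // avoid conflicts while Apple Intelligence processes the text
    }

    // Called when Writing Tools has finished.
    func textViewWritingToolsDidEnd(_ textView: UITextView) {
        resumeSyncing()  // safe to sync the (possibly rewritten) text again
    }

    private func pauseSyncing() { /* stop background sync of this note */ }
    private func resumeSyncing() { /* restart background sync */ }
}
```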
Using the Image Playground API, you can bring the same image-generation experience into your app and enable users to quickly create playful images using context from within your app.
Since images are created entirely on device, you don’t have to develop or host your own models for your users to enjoy creating new images in your app, according to Apple.
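A sketch of how an app might adopt this, assuming the ImagePlayground framework's SwiftUI sheet modifier (iOS 18.1 and later); the view, state names, and concept string are illustrative, and the parameter spelling follows the published API as understood here and may vary by SDK release:

```swift
import SwiftUI
import ImagePlayground

// Hypothetical recipe-app view that lets the user generate a header image.
struct RecipeHeaderView: View {
    @State private var showPlayground = false
    @State private var headerImageURL: URL?

    var body: some View {
        VStack {
            // Show the most recently generated image, if any.
            if let url = headerImageURL {
                AsyncImage(url: url) { image in
                    image.resizable().scaledToFit()
                } placeholder: {
                    ProgressView()
                }
            }
            Button("Create header image") { showPlayground = true }
        }
        .imagePlaygroundSheet(isPresented: $showPlayground,
                              concept: "a bowl of fresh pasta", // context from your app
                              onCompletion: { url in
                                  // Generation happens on device; the sheet hands
                                  // back a file URL to the finished image.
                                  headerImageURL = url
                              })
    }
}
```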
While emoji are represented as text, Genmoji will be represented as inline images.
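In practice, a text view opts in to accepting these inline images; a minimal sketch, assuming the iOS 18 UITextView support for adaptive image glyphs, the representation behind Genmoji (the text view name is hypothetical):

```swift
import UIKit

// Enable Genmoji input in a hypothetical messaging screen's text view.
func enableGenmojiInput(in messageField: UITextView) {
    // Rich text is required so inline images can live alongside plain text.
    messageField.allowsEditingTextAttributes = true
    // Opt in to accepting Genmoji (NSAdaptiveImageGlyph) from the keyboard.
    messageField.supportsAdaptiveImageGlyph = true
}
```

Because Genmoji live in the attributed string rather than the plain text, apps that persist or send the content should store the attributed string in a rich-text format that preserves attachments.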
‘Apple Intelligence’ also provides Siri with enhanced action capabilities. Developers can take advantage of predefined and pre-trained App Intents across a range of domains to not only give Siri the ability to take actions in your app, but to make your app’s actions more discoverable in places like Spotlight, the Shortcuts app, Control Center, and more.
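Apple's pre-trained domain schemas build on the same App Intents foundation. A minimal sketch for a hypothetical journaling app, where EntryStore stands in for an app's own model layer; declaring an intent like this is what lets Siri and the system surface and run the action:

```swift
import AppIntents

// Stand-in model layer for a hypothetical journaling app.
final class EntryStore {
    static let shared = EntryStore()
    func createEntry(titled title: String) { /* persist the new entry */ }
}

// A minimal App Intent; once declared, Siri, Shortcuts, Spotlight, and
// Control Center can surface and run the action.
struct CreateEntryIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Journal Entry"
    static var description = IntentDescription("Starts a new journal entry.")

    @Parameter(title: "Title")
    var entryTitle: String

    func perform() async throws -> some IntentResult {
        EntryStore.shared.createEntry(titled: entryTitle)
        return .result()
    }
}
```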
With ‘App Entities’, Siri can understand content from your app and surface that information to users anywhere in the system.
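Continuing the same hypothetical journaling example, an App Entity plus a query is what lets the system resolve and reference the app's content; the entity and query names are illustrative:

```swift
import AppIntents

// An entry exposed to the system as an App Entity.
struct JournalEntryEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Journal Entry")
    static var defaultQuery = JournalEntryQuery()

    var id: UUID
    var title: String

    // How the entry is shown when Siri or Spotlight surfaces it.
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Resolves entities by identifier; stubbed here in place of a real store.
struct JournalEntryQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [JournalEntryEntity] {
        identifiers.map { JournalEntryEntity(id: $0, title: "Untitled") }
    }
}
```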
In the Notes and Phone apps, users can now record, transcribe, and summarise audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.
Searching for photos and videos also becomes more convenient with Apple Intelligence.
AGENCIES