What is Apple Intelligence? The new AI tools that want to transform your iPhone

 (Getty Images)

Apple has finally revealed its response to the current hype over AI: its own technology called “Apple Intelligence”.

The tools are a “personal intelligence system for iPhone, iPad, and Mac that combines the power of generative models with personal context to deliver intelligence that’s incredibly useful and relevant”, Apple said as it announced the feature.

In some ways they are similar to the AI tools that have been announced by most major technology companies in recent months. After the update, iPhones will be able to liaise with ChatGPT and create images using generative AI.

But Apple looked to separate itself from those rivals in a variety of ways. Many of those differences rest on the architecture of its artificial intelligence offering: it is built, Apple says, in ways that ensure it is both personal and private.

Apple hopes those features will allow it to answer the critics who say that it has been slow to adopt new AI features, and risks being left behind. But it also hopes its particular focus on features such as privacy will allow it to stand out from those competitors, and avoid the potential pitfalls of leaning too strongly into a technology that has proven controversial and even dangerous.

The company also stressed that its capabilities were unique, since they rely on its commitment to privacy as well as its work on both the software and hardware running its devices.

The hardware integration means the features will not be available to most iPhone users: “Apple Intelligence” will only work on the iPhone 15 Pro and Pro Max, which Apple says is necessary because it depends on those phones’ increased processing power. It will also be available on Macs and iPads, so long as they have one of Apple’s M-series chips, and will come to all of those platforms through a free software update.

“We’re thrilled to introduce a new chapter in Apple innovation. Apple Intelligence will transform what users can do with our products — and what our products can do for our users,” said Tim Cook, Apple’s chief executive. “Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence.

“And it can access that information in a completely private and secure way to help users do the things that matter most to them. This is AI as only Apple can deliver it, and we can’t wait for users to experience what it can do.”

Rumours had suggested that Apple executives’ response to the rise of artificial intelligence had been to ask engineers across the company to examine how they could integrate it into their own apps. That appears to have happened, with AI features now present in almost all of Apple’s own apps.

In practice, those features mean using the latest developments in generative artificial intelligence – large language models that manipulate text, and diffusion models that create images – and plugging them into Apple’s own apps and the data they hold. Language tools mean that users can generate and check their writing across apps, for instance.

Apple’s tools will also examine information as it is used on the iPhone. They will use language models to spot and prioritise important notifications, for instance, and to summarise long and busy group chats in a single alert.

The iPhone can now also record, transcribe and summarise audio, including from the phone app.

Apple also added image tools that will be available throughout Apple’s platforms. Users can create an emoji or illustration just by describing it, for instance – and even include pictures of their friends.

Siri is also receiving considerable upgrades. That starts with a new design: when Siri is invoked, a fresh animation extends across the whole screen. Users can now chat with it by typing as well as talking, and Apple said it will be better at understanding requests even if someone stumbles over their words.

Siri itself can do more. It is moving towards understanding what is on screen, for instance, so that a user can ask for things to happen based on a message that is being displayed; it can also bring together information from across the phone, so that a user can for instance ask “when is Jamie’s flight landing”, and receive an answer that will pull from different apps and services.

That speaks to one of Apple’s central focuses for its AI work: it has unique access to people’s personal data, so the tools can be made individual using that information.

But it also concentrated on the privacy issues that such personalisation brings up. The iPhone will try to do most of its work on the device itself, only accessing the cloud when it needs to; when it does, a technology called “Private Cloud Compute” means that only the data required to answer a question will be sent, and that it will be deleted once the request is complete.

Apple will also start integrating other platforms, however, announcing a new partnership with OpenAI to use what it called the “best” third-party AI tool, ChatGPT. Siri will now be able to call on ChatGPT for help, but Apple attempted to address privacy concerns by ensuring that customers opt in each time.

Analysts suggested that Apple had at least begun to address criticism of its AI strategy, as well as offering ways that artificial intelligence might actually be useful.

“What does it mean for users? AI will be integrated so deeply and broadly across all apps, devices and experiences. It will mean that users will be able to accomplish much more in their daily lives – more time-saved, more life hacks, more seamless interactions, more creative ways to communicate, and more fun,” said Paolo Pescatore, analyst and founder of PP Foresight.

“Apple’s trust and credibility are critical to adoption. There still remain ongoing concerns around privacy and trust from AI fakes, which Apple is addressing with strict opt-in measures. This will be an interesting issue to watch play out.”