WWDC 2024 should kick off huge Apple and OpenAI partnership — here's how it could transform your iPhone

Tim Cook on stage at WWDC 2023.

We've already heard that Apple's upcoming iOS 18 software could represent the 'biggest' update in the iPhone's history, thanks to a ton of new AI features. These include everything from summarizing emails and webpages to transcribing voice notes and generating AI images and emoji.

But Apple is reportedly not going it alone when it comes to rolling out new AI features. Back in late April, reports surfaced that Apple had reopened dialogue with OpenAI to bring ChatGPT features to the iPhone; by mid-May, we heard that Apple and OpenAI were reportedly close to a deal.

Now this deal looks to be sealed, as The Information is reporting that OpenAI CEO Sam Altman has secured an agreement with Apple. This is apparently a major blow to Google, which had been hoping to bring its Gemini AI tools to the iPhone to join its existing multi-billion-dollar Google search partnership.

So how could bringing ChatGPT AI to Apple's coveted smartphone transform your iPhone experience? If Apple decides to leverage the latest GPT-4o technology, it could supercharge Siri with a much more powerful voice and visual assistant. Our AI editor Ryan Morrison got a live demo of GPT-4o's voice capabilities, and he said that, if anything, the technology is underhyped.

Imagine Siri but with much more intelligence as well as more emotion. You could interrupt it, ask follow-up questions and get live translations on the fly. Most important of all, GPT-4o is more conversational, and Siri could easily benefit from that same injection of personality.

Siri + ChatGPT = ?

According to The Information's report, Apple engineers have been testing ChatGPT behind the scenes and have connected it to Siri. As a result, they've been able to create "impressive demonstrations of Siri handling more complex queries than usual, including by better understanding the context of what users were saying."

This bit about context is hugely important, as the existing Siri had a terrible time keeping up with the infamous Rabbit R1 AI device when I tested the two head-to-head with a series of queries. Siri just isn't very good at answering follow-up questions right now.

To be clear, Apple has not announced an OpenAI partnership, and it hasn't announced any plans about how it might use that technology. But Apple has reportedly discussed using OpenAI for the Siri voice assistant to answer questions it normally couldn't answer by itself. Apple may also decide to roll out a separate chatbot-like app powered by OpenAI.

But it seems clear now that the buzzy GPT-4o demos were explicitly designed to seal the deal with Apple. And a big part of the demo was OpenAI's enhanced vision features. The gist is that you should be able to point your phone's camera not only at objects for identification but also at people to gauge their mood or emotional state. We had a chance to try GPT-4o's vision features with several prompts and came away impressed, though it also made some mistakes.

Expect OpenAI disclaimers

It's no surprise, then, that Apple will likely let users know when they're getting responses generated by OpenAI's ChatGPT rather than by its own models, since AI can get things wrong and "hallucinate."

Regardless, potentially adding vision features to Siri could be a game changer, as Apple tries to catch up to Google Lens, which is getting an even bigger boost with Gemini Live. Even without OpenAI, Siri is reportedly getting a major upgrade with iOS 18, as we've heard that Siri will be able to provide voice control for various apps, including Notes, Photos and Mail. This capability will be coming to third-party apps in the future.

The bottom line is that iOS 18 and Siri are about to get way smarter with the help of OpenAI, assuming this deal is legit and is announced at WWDC next week.
