“I have yet to see anything particularly useful that AI can do,” a tech journalism veteran told me ahead of Apple’s WWDC 2024 event. To a large extent, I agree with the sentiment, even though I have pushed consumer-grade AI tools into every scenario that my hardware selection allowed. By the time Apple’s event concluded, I had a strong feeling that Apple may just have delivered the most practical dose of AI on a smartphone.
We have entered the era of Apple Intelligence on iPhones. I will drop the bad news first: The whole AI platter has been served only on the latest and greatest “Pro” iPhones. The features aren’t even available on the iPhone 15 or the iPhone 15 Plus. The silicon and its onboard NPU seem to be to blame, or perhaps memory is the all-important restriction. Similar limits apply to iPads, which need at least an M-class processor.
Google’s approach isn’t too different. The company kept on-device Gemini Nano limited to the Pixel 8 series phones, while Samsung also started the race with Galaxy AI features exclusive to the Galaxy S24 series phones. Some course corrections happened in the weeks to follow.
But that’s where the similarities end. Apple Intelligence — Apple’s packaging for its iPhone AI features — is leagues ahead of the AI advancements we’ve seen on Android phones. Even this early on, the race isn’t even close.
What makes Apple Intelligence special
To its credit, Apple says it wants to do most of the AI processing on-device, a task that needs a beefy AI accelerator and silicon. Why won’t the A16 inside the iPhone 15 cut it? We don’t know, even though it comfortably beats the Pixel 8’s Tensor silicon, and even Qualcomm’s best inside the Galaxy S24 struggles to catch up with it.
But hey, I’d take the safety of my personal data any day over hardware that offloads demanding AI tasks to a cloud server run by another company. So how does AI in iOS 18 stand taller than what Android phones have accomplished with their fancy Gemini Nano licensing and deals with players like Perplexity? How do we measure functional efficacy?
Well, simply check how deeply the AI works with the onboard system and apps, instead of acting as a lone warrior with a fixed set of boundaries. This is where Apple Intelligence leaves the competition behind by a healthy margin. It is baked in at the system level in apps like Mail, Notes, Pages, and even third-party apps.
Apple’s Writing Tools, for example, will let you change the style of on-screen text, summarize it, turn it into bullet points, and more. Previously, you would need to pull up a dedicated app or tool like Paragraph AI for the same task. It will also proofread and make appropriate editing suggestions as you move ahead with your work.
AI tricks that are legitimately useful
AI should add practical value, and there is no better avenue to flex those muscles than utilitarian apps like email. In the Mail app, Apple’s AI will not only show you high-priority messages at the top. It will also generate summarized versions of emails so that you don’t have to lose your sanity reading every single one in a long thread.
Apple has extended a similar courtesy to notifications. My iPhones have given me a real case of notification anxiety, and that long stacked view of cards on the lock screen certainly doesn’t help. In iOS 18, the onboard AI will not only prioritize notifications but will also summarize them.
It seems Apple’s engineers were enamored with a certain very good AI-loaded email app called Shortwave, which offers similar smarts. Either way, I’ll take the convenience. And speaking of summarization, there’s transcription.
Thanks to AI, your iPhone will finally let you record and transcribe calls. Audio clips added to notes are also automatically transcribed, with the added facility of summarization. As a journalist who has to carry a Pixel smartphone solely for its fantastic Recorder app every day to conduct interviews and transcribe them, Apple’s new transcription and summarization features are nothing short of a godsend.
Apple is also putting some AI wizardry into Focus mode. Now, AI will surface important alerts based on their content so that you don’t miss out on something urgent. This is yet another thoughtful addition. I’ve lost track of how many times I’ve missed messages from Slack or Teams because I’ve enabled Focus mode for work hours. And since Focus syncs across Apple hardware, there’s little chance you’ll accidentally stumble across one of those message notifications on another device, either.
Google walked so Apple could run
Another neat feature is Smart Reply in the Mail app. I’ve loved this facility in Gmail, but Apple seems to be going deeper. The AI won’t just suggest an overarching reply for an email. Instead, it will identify all the questions asked in an email and suggest appropriate responses for each one. Pretty cool and convenient, I’d say.
For folks who take a lot of pictures and share them, the Photos app will let them search media with text prompts, like “find me pictures of my cat wearing a purple cap.” There’s also a neat trick that will remove unwanted elements from your pictures and perform an intelligent pixel-fill. Yes, it’s inspired by Google Photos, but so are a lot of Apple Intelligence features that were announced at WWDC.
The most powerful — and meaningful — AI-driven facility of iOS 18 is that Siri can now talk with other apps installed on your phone and perform the necessary actions. “Awareness of your personal context,” as Apple puts it. For example, you can ask, “Find me coffee shops recommended by Joe,” and Siri will look up your conversations with Joe and surface the relevant information. Apple’s demos show Siri digging into Messages, Mail, and the Files app.
These are the AI features we’ve been waiting for
Notably, thanks to the App Intents framework, developers can integrate Siri within their apps to perform a wide range of tasks. Image Playground and the ChatGPT-assisted Writing Tools will also leverage this integration. This takes Siri’s utility to a whole new level, one where it can handle tasks in virtually any app with a voice command, without users ever having to touch or tap their phone’s screen.
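For the curious, here is a minimal sketch of what an App Intents integration looks like from the developer’s side. The intent name, parameter, and dialog text below are hypothetical examples, not from any real app; only the `AppIntent` protocol, the `@Parameter` wrapper, and the `perform()` method come from Apple’s framework.

```swift
import AppIntents

// Hypothetical intent: Siri or Shortcuts could trigger this with a
// phrase like "Show my work folder in MyApp."
struct ShowFolderIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Folder"

    // Siri can fill this parameter from the spoken request.
    @Parameter(title: "Folder Name")
    var folderName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real app logic (navigation, data lookup) would run here.
        return .result(dialog: "Opening \(folderName).")
    }
}
```

Because the intent declares its parameters and behavior up front, the system handles the voice plumbing rather than the app itself — that declarative design is what lets Siri reach into “virtually any app” once developers adopt the framework.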
Virtual assistants were always supposed to make our phone interactions easier, take the drudgery out of them, and act like smart digital companions that won’t leave us frustrated at every step. Deeper app integrations and context awareness are a step in the right direction. It’s just surprising that, on its first attempt, Apple has managed to surpass the early lead of the likes of Google and delivered AI on your phone in a genuinely rewarding fashion.
But will you be OK with granting Siri access to your end-to-end encrypted chats on WhatsApp in the name of convenience? Which Apple Intelligence tricks can work offline? Is Apple being 100% ethical with its image generation capabilities from a copyright perspective? Why have otherwise uber-powerful iPhones been excluded from Apple Intelligence? And what are the fine details of Apple’s partnership with OpenAI, which bakes ChatGPT into iOS so deeply that it essentially exists as a sidekick to Siri?
There are many questions that still need to be answered. I’ll wait until Apple Intelligence is out for testing, but so far, it seems like the finest execution of the “AI on phone” idea — and has put competing Android offerings to shame.