The upcoming iOS 18 AI update is generating excitement, with a significant focus on locally running, on-device models. Reports suggest the update will introduce new AI features powered by an “entirely on-device” large language model (LLM) developed by Apple.
On-device processing promises significant advantages for iPhone and iPad users in terms of privacy and speed.
The LLM behind the iOS 18 AI features
An LLM is a complex AI system trained on massive amounts of text data. This training allows the LLM to understand and respond to language in a way that mimics human conversation.
The AI features in the upcoming iOS 18 update will leverage LLMs to perform various tasks, such as generating text, translating languages, and even producing various kinds of creative content.
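The core idea behind an LLM, learning from text which words tend to follow which, can be illustrated at a toy scale with a bigram model. This is a far cry from the transformer architecture and massive training runs of a real LLM (and has nothing to do with Apple's actual model), but the statistical principle of predicting the next token from observed data is the same:

```python
from collections import defaultdict, Counter

def train_bigram_model(text):
    """Count, for each word, which words tend to follow it."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        model[current][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequently observed next word, or None if unseen."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept"
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" most often here
```

A real LLM replaces these raw counts with billions of learned parameters over subword tokens, but it is still, at heart, a next-token predictor trained on text.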
Interestingly, 9to5Mac discovered code references in iOS 17.4 that hint at an on-device model codenamed “Ajax”. This suggests Apple has been working on this technology for some time, possibly even before the current update. Additionally, reports indicate that Apple might also be developing server-hosted versions of Ajax alongside the on-device model.
Why on-device processing for the iOS 18 AI features?
Traditionally, AI features on mobile devices rely on cloud-based LLMs. This means user data needs to be sent to remote servers for processing. Apple’s decision to develop an on-device LLM suggests a move away from this approach.
There are two key benefits to this on-device approach:
- Enhanced privacy: By keeping user data on the device itself, Apple can ensure that sensitive information stays on the phone or tablet rather than being transmitted to the cloud. This could be particularly appealing to users concerned about data privacy with the new AI features.
- Increased speed: Processing data on the device itself can be much faster than sending it to a remote server and waiting for a response. This could lead to a more responsive and fluid user experience for the AI-powered features in the upcoming update.
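The speed argument comes down to the network round trip: an on-device model answers in-process, while a cloud model pays for upload, server queueing, and download on every request. A toy simulation (the delay value and function names are purely illustrative, not measurements of any real system) makes the difference concrete:

```python
import time

def local_inference(prompt: str) -> str:
    # On-device: the model runs in-process; no network round trip.
    return prompt.upper()  # stand-in for real model output

def cloud_inference(prompt: str, rtt_seconds: float = 0.15) -> str:
    # Cloud: same work, plus a simulated network round trip
    # (upload + server queue + download).
    time.sleep(rtt_seconds)
    return prompt.upper()

start = time.perf_counter()
local_inference("hello")
local_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
cloud_inference("hello")
cloud_ms = (time.perf_counter() - start) * 1000
# cloud_ms exceeds local_ms by roughly the simulated round-trip time
```

In practice the trade-off is less one-sided: on-device inference avoids the round trip but is bounded by the phone's compute and memory, which is likely why reports tie the feature to newer hardware.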
Specific details about Apple’s on-device LLM are still under wraps. However, according to Mark Gurman in his “Power On” newsletter, the processing power needed to run the LLM might limit its capabilities to newer iPhone models, possibly the iPhone 16 and onward.
It’s also worth noting that Apple might still utilize cloud-based AI for some features in the future. Reports suggest they might be exploring partnerships with companies like Google or Baidu to leverage large-scale hardware infrastructure for specific cloud-powered AI tasks within the update.
This is still a developing story. We’ll likely learn more about Apple’s on-device LLM and its capabilities when the new iOS update is officially unveiled, most likely at WWDC in June 2024.
Featured image credit: Richard Hadiwijaya/Unsplash