In a recent ZDNET article, my friend and colleague David Gewirtz explained why he sees the upcoming iPhone 16, with its focus on iOS 18 and Apple Intelligence, as a major upgrade.
While I appreciate David’s perspective, I have to differ.
Also: Apple Intelligence is coming in iOS 18.1 developer beta. Here’s what’s new for iPhone
David says the inclusion of artificial intelligence (AI) in iOS 18 makes the iPhone 16 a must-have upgrade for him, and he highlights the potential for Apple Intelligence to revolutionize the way we interact with our devices. Although I agree with his opinion in the long term, I'm not convinced that the first version of Apple Intelligence will be the great leap forward in usability that so many expect.
Every year, my wife and I eagerly await the release of the new iPhones. As part of Apple's iPhone Upgrade Program, we return our devices, reset our loans with Citizens Bank, and get the latest models. In recent years, I've opted for the Pro Max, and my wife has chosen the base model. The expected annual improvements were incremental but appreciated.
However, despite the buzz surrounding the iPhone 16’s new features and Apple Intelligence integration, my enthusiasm for this year’s upgrade is tempered by a few concerns.
What Apple isn’t telling us about Apple Intelligence
Apple Intelligence represents a significant leap in on-device AI capabilities, bringing advanced machine learning and natural language processing directly to our phones. Unfortunately, this technology is still in its infancy. On-device LLMs and generative AI are essentially at an alpha or beta stage, and there's a lot of uncertainty about how well they'll perform on Apple's current mobile hardware.
David sees AI integration in iOS 18.1 as a significant leap forward, but let's not kid ourselves. These on-device AI features are in their infancy, which means they won't deliver the seamless experience Apple users expect. When Apple releases Apple Intelligence to the public in the fall of 2024, it will still be a beta version, not a finished product.
Apple Intelligence isn't just another routine update to iOS or even MacOS features. The devices load a scaled-down version of Apple's Foundation Models, an in-house large language model (LLM) that will be several gigabytes in size and have up to 3 billion parameters. (Compare that to the hundreds of billions of parameters that models like GPT-3.5 and GPT-4 are estimated to use, or to what Apple will run in its data centers for its Apple Intelligence "Private Cloud Compute" feature.)
Also: Apple Intelligence will improve Siri in 2024, but don’t expect most updates until 2025
Apple hasn't yet fully detailed to developers how this will work on iOS, iPadOS, and MacOS, but the model will have to be loaded, at least partially, into memory, potentially taking up 750MB to 2GB of RAM when running, according to current estimates, depending on how good Apple's memory compression technology is, among other factors.
This is a significant amount of memory allocated to a core operating system function that will not always be used. As a result, parts of it will have to be dynamically loaded into and out of memory as needed, adding new system constraints for applications and potentially increasing CPU load.
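To put those numbers in perspective, here is a rough back-of-the-envelope sketch in Swift. The roughly 3-billion-parameter figure comes from the estimates above; the quantization bit-widths and the 20% runtime-overhead multiplier are my own illustrative assumptions, not anything Apple has published.

```swift
import Foundation

// Rough estimate of the resident memory a ~3-billion-parameter on-device LLM
// might need at different quantization levels. The parameter count comes from
// the article; the bit-widths and ~20% overhead for caches and buffers are
// illustrative assumptions, not Apple's published specifications.

let parameters: Double = 3_000_000_000

let quantizations: [(label: String, bitsPerWeight: Double)] = [
    ("16-bit (fp16)", 16),
    ("8-bit", 8),
    ("4-bit", 4),
    ("2-bit", 2),
]

let runtimeOverhead = 1.2  // assumed extra for KV cache, activations, buffers

for q in quantizations {
    let weightBytes = parameters * q.bitsPerWeight / 8
    let totalGB = weightBytes * runtimeOverhead / 1_073_741_824
    print("\(q.label): roughly \(String(format: "%.1f", totalGB)) GB resident")
}
// Prints roughly: 6.7 GB (fp16), 3.4 GB (8-bit), 1.7 GB (4-bit), 0.8 GB (2-bit).
```

Only with aggressive quantization and compression does the footprint fall into the 750MB-to-2GB range cited above, which is why so much rides on how good Apple's memory compression turns out to be.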
Current iPhone hardware can't handle this well
Earlier this month, I discussed how older iOS devices, as well as the current generation, are not powerful enough to handle generative AI tasks on the device. The base iPhone 15, which has only 6GB of RAM, may struggle to meet Apple Intelligence's requirements as it evolves and becomes more integrated into iOS, Apple's core apps, and developer apps. Older iPhones have 6GB of RAM or less.
An iPhone 15 Pro with 8GB of RAM may be more suitable for these tasks. It’s the only device iOS developers can use to test Apple Intelligence (besides their Macs and iPad Pros) before the iPhone 16 ships, likely in October. This means that many users may still experience sub-optimal performance on 8GB devices when Apple Intelligence is fully implemented.
Also: iOS 18.1 update: Every iPhone model to support Apple’s new AI features (so far)
Early adopters may find the AI features more useful to developers than to regular users, as the system will need some fine-tuning to reach its full potential. I also expect that, just as owners of the base iPhone 15 and earlier iPhones won't get access to it when they upgrade to iOS 18.1, Apple Intelligence will be a feature that other users can simply turn off to free up memory for their apps.
The upcoming iPhone 16, while it may have more advanced hardware, may also struggle with new AI capabilities due to design cycles that didn’t include these features. It may take another product cycle or two for the hardware to fully align with the new AI capabilities coming in iOS 18.1 and later. As a result, users may experience suboptimal performance and a less seamless user experience.
Why you shouldn’t buy the iPhone 16 for Apple Intelligence
For these reasons, I see the iPhone 16 (and potentially the iPhone 17) as a transitional product in Apple's journey toward on-device AI.
Beyond further silicon optimizations, future iPhones will likely require more RAM to fully support these AI features, which could drive up costs. If the base iPhone 16 needs 8GB of RAM to run Apple Intelligence efficiently, the starting price could be pushed to $899 or higher. Pro models may require 12GB or even 16GB of RAM, further increasing the price. That would also mean a new A18 chip for the Pro models, while the base iPhone 16 might get only the current A17, though perhaps Apple could build an "A17X" with 10GB of RAM to give the phone more breathing room.
Also: How iOS 18 changes the way you charge your iPhone
Beyond memory, AI processing also requires a lot of energy and other computing resources. Without significant advances in battery and power-management technology, users may have to charge their phones more often. This can lead to increased battery drain, reduced battery life, and potential performance issues. The additional processing power required to run an LLM on the device could stress the CPU, cause the device to heat up, and affect its overall performance and reliability.
How Apple Intelligence is likely to evolve
Apple's AI capabilities are expected to improve significantly in the coming years. By 2025, we may see more advanced and deeper integration of Apple Intelligence not only in mobile devices and Macs, but also in products such as the Apple Watch, HomePod, Apple TV, and the consumer-oriented version of the Vision headset.
To extend Apple Intelligence to these less powerful devices, Apple could lean on cloud resources, much as it already plans to do with "Private Cloud Compute" by running secure Darwin-based servers in its data centers for more demanding LLM processing, and on fully built-out data-center capacity plus partnerships with companies such as OpenAI or Google.
Also: Apple needs to do these 3 things to save the Vision Pro
Alternatively, Apple could consider a distributed or "networked" AI processing system, in which idle devices in a home or business assist less powerful ones with LLM queries.
Apple could achieve this by equipping MacOS, iOS and iPadOS with Apple Intelligence and on-device LLM as planned. Subsequent changes could allow all devices to communicate their generative AI capabilities and idle processing state. This would allow them to act as proxies for each other’s Apple Intelligence requests.
Enterprises could also use mobile device management solutions to facilitate access to on-device LLMs on corporate Macs. In addition, iPhones or Macs could act as proxies for Apple Watch or HomePod requests from mobile users. We may also see a more powerful Apple TV with more memory and processing power that acts as an Apple Intelligence "hub" for every Apple device in the home.
Imagine your iPhone using the unused processing power of your Mac or iPad, all equipped with on-device LLM, to solve complex AI tasks. This would increase the availability of AI features across Apple’s product line.
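As a purely speculative illustration of how that delegation might look, here is a minimal Swift sketch. None of these types or APIs exist in Apple's SDKs; the device names, fields, and selection logic are hypothetical and only meant to show the idea of routing a prompt to the most capable idle peer.

```swift
// Hypothetical sketch only: these types and APIs do not exist in Apple's SDKs.
// Each device advertises whether it carries an on-device LLM and whether it is
// idle; a constrained device (Watch, HomePod) forwards its prompt to the most
// capable idle peer, falling back to the cloud when none is available.

struct PeerDevice {
    let name: String
    let hasOnDeviceLLM: Bool
    let isIdle: Bool
    let memoryGB: Int

    // Stand-in for real LLM inference running on the peer.
    func handle(prompt: String) -> String {
        "[\(name)] response to: \(prompt)"
    }
}

// Pick the most capable idle peer to proxy an Apple Intelligence request.
func selectProxy(from peers: [PeerDevice]) -> PeerDevice? {
    peers
        .filter { $0.hasOnDeviceLLM && $0.isIdle }
        .max { $0.memoryGB < $1.memoryGB }
}

// Example: an Apple Watch delegating a request to whichever home device is idle.
let peers = [
    PeerDevice(name: "Mac Studio", hasOnDeviceLLM: true, isIdle: true, memoryGB: 64),
    PeerDevice(name: "iPad Pro", hasOnDeviceLLM: true, isIdle: false, memoryGB: 16),
    PeerDevice(name: "Apple TV", hasOnDeviceLLM: false, isIdle: true, memoryGB: 4),
]

if let proxy = selectProxy(from: peers) {
    print(proxy.handle(prompt: "Summarize today's calendar"))
} else {
    print("No idle peer available; fall back to Private Cloud Compute.")
}
```

The interesting design questions in such a scheme would be discovery, trust, and privacy, since proxied requests would need to carry the same guarantees Apple promises for Private Cloud Compute.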
I’m still an optimist
Despite the hype surrounding Apple Intelligence, there are many other reasons to consider upgrading to the iPhone 16, such as improved camera quality, display and overall performance. The iPhone 16 is likely to feature better sensors, improved computational photography, and superior video capabilities. The display may also see improvements in brightness, color accuracy and refresh rate, making it a better device for media consumption and gaming.
Also: Apple may be cooking up something big with its new Game Mode. Here are 3 things we know
However, if you’re considering the iPhone 16 solely for its AI capabilities — which are still developing and unlikely to deliver the performance expected in Apple’s WWDC 2024 keynote — you might want to keep your expectations in check.