Apple was the first major tech company to launch a voice assistant, Siri, back in 2011. But a common criticism is that Siri has struggled to compete with Amazon’s Alexa, Google Assistant and, more recently, ChatGPT-powered assistants.
If the rumors are right, however, Siri is expected to receive a big brainpower boost this year, thanks mostly to AI. The integration of large language models, the technology behind ChatGPT, is poised to transform Siri into what one leaker envisions as the “ultimate virtual assistant.”
In December, Apple published research showing it can run large language models on-device, similar to what Qualcomm and MediaTek have done with their chips in Android phones. This may indicate that Siri will finally get the overhaul iPhone fans have been waiting for, including the ability to chat like ChatGPT.
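To get a feel for what “run on-device” means in practice, here’s a minimal sketch of the token-by-token generation loop that local LLM inference boils down to. The “model” below is just a stand-in closure, not Apple’s research code or any real Apple API; the point is only that each prediction step can happen entirely on the phone, with no network call in the loop.

```swift
// Minimal sketch of an on-device generation loop: the model predicts one token at a
// time from everything generated so far. The model here is a stand-in closure,
// not a real Apple API.
func generate(prompt: [Int],
              maxTokens: Int,
              endToken: Int,
              nextToken: ([Int]) -> Int) -> [Int] {
    var tokens = prompt
    for _ in 0..<maxTokens {
        let next = nextToken(tokens)   // one forward pass, entirely on the device
        if next == endToken { break }
        tokens.append(next)
    }
    return tokens
}

// Toy stand-in "model": always predicts the last token ID plus one, stopping at 9.
let output = generate(prompt: [1, 2, 3], maxTokens: 10, endToken: 9) { context in
    (context.last ?? 0) + 1
}
print(output)   // [1, 2, 3, 4, 5, 6, 7, 8]
```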
Only Apple knows what’s next for the iPhone and its other products, but here’s how Siri could change in the iPhone 16.
Siri could get better at handling follow-up requests
Imagine you ask Siri when the Olympics are taking place. It quickly spits out the correct dates in the summer of this year. But if you follow that up with, “Add it to my calendar,” the virtual assistant stumbles, responding with “What should I call it?” The answer to that question would be obvious to any human. Even when I responded, “Olympics,” Siri replied, “When should I schedule it for?”
The reason Siri tends to falter is that it lacks contextual awareness, which limits its ability to follow a conversation the way a human can. That could change this June, when Apple is rumored to unveil improvements to Siri with iOS 18.
The iPhone maker is training Siri (and the iPhone’s Spotlight search tool) on large language models to improve the virtual assistant’s ability to answer more questions accurately, according to the October edition of Mark Gurman’s Bloomberg newsletter, Power On. A large language model is a type of AI that excels at understanding and producing natural language. As those models improve, Siri should get better at processing the way people actually speak, which should let it understand more complex and nuanced questions and answer them accurately. All in all, Siri is expected to become a more context-aware and powerful virtual assistant.
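For a sense of what “contextual awareness” means in practice, here’s a hedged Swift sketch of the core change an LLM-backed assistant makes possible: prior turns of the conversation get passed along with the new request, so a follow-up like “Add it to my calendar” doesn’t arrive in a vacuum. The type and method names are illustrative, not Apple’s implementation.

```swift
// Hedged sketch: a context-aware assistant keeps the conversation history and hands
// all of it to the language model with each new request, so the model can work out
// what "it" refers to. `ConversationContext` is an illustrative name, not Apple's API.
struct ConversationContext {
    var turns: [(role: String, text: String)] = []

    mutating func add(role: String, text: String) {
        turns.append((role, text))
    }

    // Everything the model would see when producing the next reply:
    // the full exchange so far, followed by the new request.
    func prompt(for request: String) -> String {
        let history = turns.map { "\($0.role): \($0.text)" }.joined(separator: "\n")
        return history + "\nuser: \(request)"
    }
}

var context = ConversationContext()
context.add(role: "user", text: "When are the Olympics taking place?")
context.add(role: "assistant", text: "The Paris Olympics run July 26 to August 11, 2024.")

// The follow-up now carries the earlier answer with it, so "it" is resolvable.
print(context.prompt(for: "Add it to my calendar."))
```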
Siri may get better at executing multistep tasks
Apart from understanding people better, Siri is also expected to become more capable and efficient in the coming months. Apple plans to use large language models to make Siri smarter, according to a September report from The Information. The report gave an example of how Siri could handle a more complex task from a simple voice command, such as turning a set of photos into a GIF and sending it to one of your contacts, which would be a significant step forward for Siri’s capabilities.
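As an illustration of what that kind of multistep handling might look like under the hood, here’s a hedged Swift sketch: a single spoken request gets broken into an ordered plan of smaller actions that existing app features could carry out. The step names and the plan format are made up for this example; The Information’s report didn’t describe Apple’s actual design.

```swift
// Hedged sketch of multistep task handling: one voice command becomes an ordered
// plan of smaller actions. The step names here are illustrative, not Apple's design.
enum AssistantStep {
    case selectPhotos(matching: String)
    case makeGIF
    case send(to: String)
}

// What a planner might produce for:
// "Turn my beach photos into a GIF and send it to Mom."
let plan: [AssistantStep] = [
    .selectPhotos(matching: "beach"),
    .makeGIF,
    .send(to: "Mom"),
]

func run(_ plan: [AssistantStep]) {
    for (index, step) in plan.enumerated() {
        switch step {
        case .selectPhotos(let query):
            print("Step \(index + 1): find photos matching '\(query)'")
        case .makeGIF:
            print("Step \(index + 1): combine the selected photos into a GIF")
        case .send(let contact):
            print("Step \(index + 1): send the result to \(contact) in Messages")
        }
    }
}

run(plan)
```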
Siri may improve its interactions with the Messages app (and other apps)
Apart from answering questions, the next version of Siri could become better at automatically completing sentences, according to a Bloomberg report published in October.
Thanks to LLMs, which are trained on troves of data, Siri is expected to up its predictive-text game. Beyond that, Apple is rumored to be planning to add AI to as many of its apps as possible, which could even include a feature in the Messages app for crafting complex messages.
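To show the basic idea behind predictive text, here’s a deliberately tiny Swift sketch: it counts which word most often follows the word you just typed and suggests the top candidates. An LLM does the same job with vastly more context and accuracy; nothing here reflects Apple’s actual keyboard or Messages code.

```swift
// Toy sketch of predictive text: suggest next words based on how often each word has
// followed the current one in sample text. An LLM replaces these raw counts with a
// model that weighs the whole sentence (and conversation) of context.
func nextWordSuggestions(after word: String, trainedOn corpus: String, limit: Int = 3) -> [String] {
    let words = corpus.lowercased().split(separator: " ").map { String($0) }
    var counts: [String: Int] = [:]
    for i in 0..<max(words.count - 1, 0) where words[i] == word.lowercased() {
        counts[words[i + 1], default: 0] += 1
    }
    return counts.sorted { $0.value > $1.value }.prefix(limit).map { $0.key }
}

let sample = "running late be there soon running late sorry running out the door"
print(nextWordSuggestions(after: "running", trainedOn: sample))
// ["late", "out"]  ("late" ranks first because it follows "running" most often)
```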
Apple never talks specifics about products before they launch. But since Apple usually unveils new iPhone software features at WWDC in June, we’ll likely learn more about its AI plans for the iPhone then.
Editors’ note: CNET is using an AI engine to help create some stories. For more, see this post.