Tom’s Guide editor-in-chief Mark Spoonauer and TechRadar editor-at-large Lance Ulanoff sat down with Apple’s software engineering chief Craig Federighi and Apple’s marketing chief Greg Joswiak for a WWDC interview. During the interview, Federighi discussed why “Personalized Siri” isn’t yet ready for prime time.
While Apple did unveil new Apple Intelligence features during Monday’s WWDC25 keynote – such as Live Translation in iOS 26, a new Visual Intelligence feature that can read your screen, an AI-powered Shortcuts app, and more – Personalized Siri was nowhere to be found.
Federighi said that Apple’s first-generation AI architecture turned out to be too limited for the Siri features promised during last year’s WWDC keynote to meet the company’s high quality standards. So, this spring, Apple decided to move Personalized Siri to a second-generation architecture that is still under development, which resulted in the delay of the promised features.
“We found that the limitations of the V1 architecture weren’t getting us to the quality level that we knew our customers needed and expected…if we tried to push that out in the state it was going to be in, it would not meet our customer expectations or Apple standards and we had to move to the V2 architecture.”
Apple’s marketing chief Greg Joswiak confirmed that when Apple says Personalized Siri will be available “in the coming year,” it means 2026. This means we’ll likely see it appear in an update to iOS 26 sometime next year.
“We will announce the date when we’re ready to seed it, and you’re all ready to be able to experience it,” said Federighi.
The delay in Personalized Siri features has also caused legal trouble for Apple: the company is facing multiple class action lawsuits, in both the United States and Canada, over the delayed features, which it advertised heavily in the last quarter of last year.
The commercial in question, which starred actor Bella Ramsey, showed a Siri with an improved understanding of a user’s personal context and with on-screen awareness. Unfortunately, those features never made an appearance in iOS 18, and Apple announced earlier this year that they would be delayed further, due to the need for more development time.
Apple says it wants to “meet people where they are” with AI. Federighi and Joswiak gave the example of the new Live Translation feature in the Messages, Phone, and FaceTime apps. When you receive a message from someone in a different language than yours, Live Translation will translate it for you.
“It’s integrated so it’s there within reach whenever you need it in the way you need it with it being contextually relevant and having access to the tools necessary to accomplish what you want to accomplish at that moment,” said Federighi.
“Apple’s job is to figure out the right experiences that make sense in the context of what we offer to customers and to make that technology,” said Joswiak. “The features that you’re seeing in Apple Intelligence isn’t a destination for us. There’s no app on intelligence. [It’s about] making all the things you do every day better.”