Apple WWDC Interview: Craig Federighi and Joz on Siri Delay, New Apple Intelligence Features, and What's Next for AI

Apple's Worldwide Developers Conference (WWDC) 2025 brought a wave of exciting announcements, with Apple Intelligence taking center stage despite initial expectations of a low-key presence. New features such as Live Translation in iOS 26, an improved Visual Intelligence that can read what's on your screen, and AI-enhanced capabilities for the Phone app (Call Screening and Hold Assist) and the Shortcuts app promise to redefine the user experience.

However, one key element remains elusive: the revamped Siri. While Apple continues to develop promised capabilities such as understanding personal context, on-screen awareness, and in-app actions, their full rollout has been pushed to 2026, after iOS 26 launches this fall.

Recently, Craig Federighi, Apple's senior vice president of software engineering, and Greg Joswiak, Apple's senior vice president of worldwide marketing, sat down to provide deeper insight into the future of Siri, Apple's AI strategy, and how its approach fundamentally differs from those of OpenAI and Google Gemini.

So, What's the Deal with Siri's Delay?

Apple unveiled enhancements to Siri with iOS 18, including a more conversational experience, contextual awareness, and the ability to type to Siri. However, the most anticipated features have been delayed. The core reason, according to Craig Federighi, lies in architectural limitations:

"We found that the limitations of the V1 architecture weren't getting us to the quality level that we knew our customers needed and expected... If we tried to push that out in the state it was going to be in, it would not meet our customer expectations or Apple standards, and we had to move to the V2 architecture," said Federighi.

To break it down:

  • Apple initially developed Siri using a "V1" architecture.
  • While V1 showed promise, it didn't meet Apple's rigorous quality standards.
  • Simultaneously, Apple was working on a "V2" architecture designed for a complete customer solution.
  • The decision was made to transition fully to the V2 architecture, resulting in the delay.

The exact timeline for the updated Siri remains unconfirmed. Apple's stance is clear: they will announce the release date only when the update is fully ready for public use.

No Siri as Your Therapist (Yet)

The rise of AI-powered voice assistants has led to their use in various capacities, including providing life advice and even acting as stand-in therapists. While Federighi acknowledges the potential of such applications, he indicates that this isn't Apple's primary focus right now.

"As a therapist, it's a reasonable thing to do," said Federighi. "I know a lot of people find it to be a real powerful way to gather their thoughts, you know, brainstorm, do all kinds of things. And so sure, these are great things, but are they the most important thing for Apple to develop well?"

Apple's AI Philosophy: Not Just Another Chatbot

Apple's approach to AI centers on integrating intelligence seamlessly into existing apps and services, rather than creating a standalone chatbot. This means delivering Apple Intelligence features within apps to enhance user experience. Examples include:

  • Call Screening and Hold Assist in the Phone app
  • Live Translation in Messages, Phone, and FaceTime

"Apple's job is to figure out the right experiences that make sense in the context of what we offer to customers and to make that technology," said Joswiak. "The features that you're seeing in Apple Intelligence isn't a destination for us. There's no app on intelligence. [It's about] making all the things you do every day better."

What's on the Horizon for Apple Intelligence?

Apple is currently focused on delivering impactful AI features across its upcoming software releases (iOS 26, iPadOS 26, macOS Tahoe 26, and watchOS 26), including the new Workout Buddy feature on Apple Watch.

Furthermore, Apple is opening its large language models to third-party developers, enabling them to leverage Apple Intelligence capabilities on-device. Visual Intelligence, with its ability to identify objects on-screen and facilitate instant purchases (e.g., buying an item on Etsy), exemplifies Apple's AI evolution.

In Conclusion

Apple Intelligence is an evolving endeavor, with the company committed to delivering the promised Siri enhancements and a broad spectrum of AI-driven features designed to strengthen its ecosystem. The emphasis is on enhancing existing user experiences and providing practical, contextually relevant AI solutions.

"In the end, people buy products, right? They buy experiences," said Joswiak. "We're very proud of the fact that across each of our hero product categories, we're number one in customer satisfaction.

"There's a reason for that, and we're trying to make those product experiences better and better, and make those products better and better, and that's what customers care about."

Tags: Apple Intelligence, iOS 26, Live Translation, WWDC 2025, Siri AI, Craig Federighi, Greg Joswiak, AI Features, Visual Intelligence, Apple Keynote

Source: https://www.tomsguide.com/ai/apple-intelligence/wwdc-interview-apples-craig-federighi-and-greg-joswiak-on-siri-delay-voice-ai-as-therapist-and-whats-next-for-apple-intelligence
