Report Hints at Improved Siri This Fall After Internal Apple Turmoil


According to a report by The New York Times, the first round of Siri improvements could arrive as soon as this fall to shore up the struggling digital assistant, which is supposed to be a linchpin of Apple Intelligence.

In an article about Apple's challenges, which range from US tariffs on China to strife among the executives and teams responsible for advancing Siri, writer Tripp Mickle included this tidbit: "The company plans to release a virtual assistant in the fall capable of doing things like editing and sending a photo to a friend on request, three people with knowledge of its plans said."

That description is still a far cry from the interconnected smart assistant teased at WWDC 2024 and during the launch of the iPhone 16 series, where Siri would be able to pull context from texts and emails about family members arriving on incoming flights. It also doesn't address the fact that the current state of Siri appears to be regressing.

In a rare move by the secretive company, Apple acknowledged in a March statement to Daring Fireball from spokeswoman Jacqueline Roy that its efforts to usher in a smarter Siri digital assistant as part of Apple Intelligence are "going to take us longer than we thought" and that the company "anticipate(s) rolling [these features] out in the coming year." Now the Times report says we could see improvements as early as the fall.

Basic queries shouldn't be difficult

Siri and Apple Intelligence have taken several knocks lately. Behind the scenes, Apple shook up its executive ranks and removed John Giannandrea from his role overseeing Siri, a transition detailed in the Times article as well as a more detailed behind-the-scenes look published by The Information (and summarized well at MacRumors).

But Siri also seems to lack context for basic queries. Apple did fix an earlier problem where asking Siri, "What month is it?" got the curt reply, "Sorry, I don't understand." Still, when I ask that same question now, I don't get the month; I get the current full date. And when I phrase the question as, "What is the current month?" I'm told, "It was Tuesday, April 1, 2025." (If I weren't aware of Siri's issues, I might wonder if the digital assistant was trying to play an April Fools' joke on me.)

At left: "What month is it?" answered with "It's Friday, April 11, 2025." At right: "What is the current month?" answered with "It was Tuesday, April 1, 2025."

Siri can't quite parse basic queries, but at least it's no longer replying with "I don't understand."

Screenshot by Jeff Carlson/CNET

Parsing a basic question like that doesn't seem to be a heavy request. Perhaps it never came up because it's the kind of question only someone waking from a coma or being rescued from a deserted island would ask.

All of this is frustrating for shareholders, journalists (though we've grudgingly gotten accustomed to it) and customers, especially when they expect a level of assistant competency from Apple that just isn't there. And the secrecy invites the same kind of months-long drumbeat of "Apple is falling behind on AI" that led up to the reveal of Apple Intelligence.

By taking the unusual (for Apple) step of responding to investor and media pressure -- and announcing features that aren't close to ready -- the company may have made things worse by confirming that analysts, reporters and fans were right.

The smart play would be for Apple to adhere to its secretive ways, not previewing its features and capabilities until they're much closer to being ready to ship. This week's leak suggests the company might be getting the message.

Apple's inflated expectations

Apple's approach to product development has been to work on projects secretly, over years if needed, until they're ready to see the light. They're often not 100% baked at release, but when they're ready to be introduced to the public, the core features and functions are there.

I could cite plenty of examples. It's a valid argument that the Vision Pro is not a successful product -- it's expensive, it hasn't been broadly adopted by customers or developers, it's uncomfortable and so on -- but the essential elements such as processing power, micro-OLED screens and VisionOS are all there as a solid foundation.

When a product's existence is heavily leaked ahead of time, Apple typically unveils a finished version -- even if it's still limited in functionality. It was generally expected leading up to Macworld Expo in 2007 that Apple would announce a phone -- particularly following the embarrassment of the Motorola ROKR E1 phone. But no one expected it to break from other smartphones of the time with its large screen, lack of physical keyboard and full web browser.


When Steve Jobs revealed the original iPhone, it was a radical idea to not include a physical keyboard.

CNET / Screenshot by Jeff Carlson

What's different this time is that Apple's promise of an advanced Siri, meant to anchor Apple Intelligence, seems to be a reaction to investors, the media and early adopters obsessed with not just the presence of AI but also the immediacy of AI. Apple needs to be seen as an active player in the AI space with competitive features -- and to show that those features are just around the corner.

Also around that corner? The yearly iPhone refresh. Apple, like other phone-makers, sees AI as an important driver of new phone sales, since only its iPhone 15 Pro and iPhone 16 series models have the processing power to run Apple Intelligence. And that's how we got a WWDC keynote in 2024 focused on Apple Intelligence and promising that very soon Siri would become an intelligent agent that can pull data from every corner of your iPhone to respond to queries such as "What time does my mom's flight arrive?"

LLMs don't follow a traditional release model

Large language models (LLMs) such as ChatGPT are advancing at a record pace. They're now much more naturally conversational and can summarize large amounts of information well. Real-time audio transcription, for example, is game-changing for someone like me who has always struggled to hand-write notes.

At the same time, these AI technologies are not making the kinds of gains that tech giants like Google and OpenAI expect. Apple isn't the only company hanging its AI future on intelligent agents that know everything about us.

Perhaps Apple, like Google, saw the brain-bending pace of advancement in LLMs' capabilities and figured the bumps and stumbles it's facing now could be solved with a few quick bug fixes and AI model recompiles. With those smoothed over, connecting the pieces and presenting them as the next generation of Siri would take a few months.

But that's not how it's playing out. AI hallucinations and bad data are still a problem -- are you getting your recommended dietary requirement of rocks?

I suspect Apple is smarting not just from having to delay its Siri plans, but from being forced to do so publicly. And yet, even if Future Siri doesn't make an appearance in the near future, there are plenty of opportunities coming up to continue improving Apple Intelligence features. Work on iOS 19 and iPhone 17 models, plus preparations for WWDC 2025, are no doubt well under way. Now that there are fewer expectations for the stalwart assistant, perhaps Siri's year will improve from here.
