⌥ Apple’s Supposed A.I. Strategy Shift Is the Company’s Normal Strategy

By: Nick Heer

Mark Gurman, last week in Bloomberg:

Apple Inc. plans to open Siri to outside artificial intelligence assistants, a major move aimed at bolstering the iPhone as an AI platform.

The company is preparing to make the change as part of a Siri overhaul in its upcoming iOS 27 operating system update, according to people with knowledge of the matter. The assistant can already tap into ChatGPT through a partnership with OpenAI, but Apple will now allow competing services to do the same.

This is not unexpected. In the Apple Intelligence introduction at WWDC 2024, Craig Federighi said “we want you to be able to use these external models without having to jump between different tools”, and that they were “starting” with ChatGPT. Gurman points this out, and also notes that Federighi teased Google Gemini integration. Tim Cook, in an October 2025 earnings call, said much the same. (Gurman also notes that this integration is “separate from Apple’s work with Google to rebuild Siri using Gemini models”, but “the news initially weighed on shares of Google”, which I am sure is exactly the reason for them dropping 3.4% and nothing to do with an existing weeklong slide but, then again, I do not work at Bloomberg, so who the hell am I to say?)

Gurman, in his “Power On” newsletter over the weekend, further explored what he calls Apple “doubl[ing] down” on a “revamped A.I. and Siri strategy”:

That reality is shaping the company’s new approach, set to be unveiled at the Worldwide Developers Conference on June 8. Rather than engaging in an AI arms race, Apple is focusing on its core strengths: selling highly profitable hardware and making money off the services that run on it.

Historically, Apple’s software — iMessage, Maps and Photos, for example — has been about driving product sales rather than generating revenue in their own right. Rivals, in contrast, are aggressively monetizing AI through subscriptions and premium apps. Apple understands that few, if any, users will pay for Siri or its other AI technology. The opportunity to turn Apple Intelligence into a moneymaker has effectively passed.

What would have been more newsworthy here is if Apple’s A.I. strategy were anything other than building software exclusively for its proprietary hardware. This does not sound like a “revamped” strategy; it sounds like Apple’s whole deal. If it can monetize Apple Intelligence or Siri in the future, it certainly might; it is putting ads in Apple Maps, after all. Services is a money-printing machine with less risk. But Apple is still a hardware company.

This part made me do a double-take and wonder if I had missed something. In February 2024, following Apple’s cancellation of its car project, Gurman predicted that hardware would continue to be Apple’s primary business “for now”, as though that would change in the near future. This prediction has been a constant since Apple Intelligence was announced at WWDC that year.

What one could argue has been a change of strategy is the rumoured development of a chatbot; Gurman called it a “strategic shift” when he broke the news. But that, too, is somewhat inaccurate. Gurman describes it as an overhauled version of Siri that will let people do normal Siri stuff — setting timers, end of list — plus some of the features Apple announced in 2024 but has not yet shipped. Those features, confusingly, were also first set to ship in an update to iOS 26 without the wholly new version of Siri, but also depend on Gemini. Got it?

But even that is not much of a strategy shift. Gurman tweeted in May 2024 — before WWDC and the debut of Apple Intelligence — that “Apple isn’t building its own chatbot but knows the market wants it so it’s going elsewhere for it. It’s the same playbook as search.” So, again, it is just borrowing from its ages-old playbook. It will continue to have proprietary stuff that ostensibly works seamlessly across a user’s Apple-branded hardware, allow installation of third-party add-ons, and rely on Google for some core functionality. How, exactly, is this a “revamp”?

Anyway, here is what Gurman wrote in January after the Gemini announcement and before the first build of iOS 26.4 was released:

Today, Apple appears to be less than a month away from unveiling the results of this partnership. The company has been planning an announcement of the new Siri in the second half of February, when it will give demonstrations of the functionality.

Whether that takes the form of a major event or a smaller, tightly controlled briefing — perhaps at Apple’s New York media loft — remains unclear. Either way, Apple is just weeks away from finally delivering on the Siri promises made at its Worldwide Developers Conference back in June 2024. At long last, the assistant should be able to tap into personal data and on-screen content to fulfill tasks.

Apple today shipped the first build of iOS 26.5 to developers without any sign of those features. While they may come in a later build, Juli Clover, of MacRumors, speculates they have been kicked to iOS 27.

Does not seem like much has changed at all.

Apple Head Computer, Apple Intelligence, and Apple Computer Heads

By: Nick Heer

Benedict Evans:

That takes us to xR, and to AI. These are fields where the tech is fundamental, and where there are real, important Apple kinds of questions, where Apple really should be able to do something different. And yet, with the Vision Pro Apple stumbled, and then with AI it’s fallen flat on its face. This is a concern.

The Vision Pro shipped as promised and works as advertised. But it’s also both too heavy and bulky and far too expensive to be a viable mass-market consumer product. Hugo Barra called it an over-engineered developer kit — you could also call it an experiment, or a preview or a concept. […]

The main problem, I think, with the reception of the Vision Pro is that it was passed through the same marketing lens as Apple uses to frame all its products. I have no idea if Apple considers the sales of this experiment acceptable, the tepid developer adoption predictable, or the skeptical press understandable. However, if you believe the math on display production and estimated sales figures, they more-or-less match.

Of course, as Evans points out, Apple does not ship experiments:

The new Siri that’s been delayed this week is the mirror image of this. […]

However, it clearly is a problem that the Apple execution machine broke badly enough for Apple to spend an hour at WWDC and a bunch of TV commercials talking about vapourware that it didn’t appear to understand was vapourware. The decision to launch the Vision Pro looks like a related failure. It’s a big problem that this is late, but it’s an equally big problem that Apple thought it was almost ready.

Unlike the Siri feature delay, the Vision Pro’s launch does not, I think, affect the company’s credibility at all. Apple can keep pushing that thing and trying to turn it into something more mass-market. This Siri stuff, though, is going to make me look at WWDC in a whole different light this year.

Mark Gurman, Bloomberg:

Chief Executive Officer Tim Cook has lost confidence in the ability of AI head John Giannandrea to execute on product development, so he’s moving over another top executive to help: Vision Pro creator Mike Rockwell. In a new role, Rockwell will be in charge of the Siri virtual assistant, according to the people, who asked not to be identified because the moves haven’t been announced.

[…]

Rockwell is known as the brains behind the Vision Pro, which is considered a technical marvel but not a commercial hit. Getting the headset to market required a number of technical breakthroughs, some of which leveraged forms of artificial intelligence. He is now moving away from the Vision Pro at a time when that unit is struggling to plot a future for the product.

If you had no context for this decision, it looks like Rockwell is being moved off Apple’s hot new product and onto a piece of software that perennially disappoints. It looks like a demotion. That is how badly Siri needs a shakeup.

Giannandrea will remain at the company, even with Rockwell taking over Siri. An abrupt departure would signal publicly that the AI efforts have been tumultuous — something Apple is reluctant to acknowledge. Giannandrea’s other responsibilities include oversight of research, testing and technologies related to AI. The company also has a team reporting to Giannandrea investigating robotics.

I figured as much. Gurman does not clarify in this article how much of Apple Intelligence falls under Giannandrea’s purview, and how much is part of the “Siri” work being transferred to Rockwell. It does not sound as though Giannandrea will have no further Apple Intelligence responsibilities — yet — but the high-profile, public-facing work is now overseen by Rockwell and, ultimately, Craig Federighi.

Siri Invented a Calendar Event and Then Hallucinated a Helpful Suggestion

By: Nick Heer

Go figure — just one day after writing about how Apple’s ambiguous descriptions of supposedly clever features have the potential to rob trust, my phone has become haunted.

I saw a suggestion from Siri that I turn on Do Not Disturb until the end of an event in my calendar — a reservation at a restaurant from 8:30 until 10:00 this morning. No such matching event was in Fantastical. It was, however, shown in the Calendar app as a Siri Suggestion.

What I think happened is that I was looking at that restaurant on OpenTable at perhaps 8:00 this morning. I was doing so in my web browser on my Mac, and I was not logged into OpenTable. My Mac and iPhone are both running operating system beta builds with Apple Intelligence enabled. Siri must have interpreted this mere browsing as me making a reservation, and then added it to my calendar without my asking, and then made a suggestion based on that fictional event.

This was not helpful. It was, in fact, perplexing and creepy. I do not know how all of these things were able to work together to produce this result, but I do not like it at all. It is obvious how this would make anyone question whether they can trust Apple Intelligence, A.I. systems generally, Siri, and their personal privacy. Truly bizarre.

⌥ Ambiguity and Trust in Apple Intelligence

By: Nick Heer

Spencer Ackerman has been a national security reporter for over twenty years, and was partially responsible for the Guardian’s coverage of NSA documents leaked by Edward Snowden. He has good reason to be skeptical of privacy claims in general, and his experience updating his iPhone made him worried:

Recently, I installed Apple’s iOS 18.1 update. Shame on me for not realizing sooner that I should be checking app permissions for Siri — which I had thought I disabled as soon as I bought my device — but after installing it, I noticed this update appeared to change Siri’s defaults.

Apple has a history of changing preferences and of dark patterns. This is particularly relevant in the case of the iOS 18.1 update because it was the one that introduced Apple Intelligence, which creates new ambiguity between what is happening on-device and what goes to a server farm somewhere.

Allen Pike:

While easy tasks are handled by their on-device models, Apple’s cloud is used for what I’d call moderate-difficulty work: summarizing long emails, generating patches for Photos’ Clean Up feature, or refining prose in response to a prompt in Writing Tools. In my testing, Clean Up works quite well, while the other server-driven features are what you’d expect from a medium-sized model: nothing impressive.

Users shouldn’t need to care whether a task is completed locally or not, so each feature just quietly uses the backend that Apple feels is appropriate. The relative performance of these two systems over time will probably lead to some features being moved from cloud to device, or vice versa.

It would be nice if it truly did not matter — and, for many users, the blurry line between the two is probably fine. Private Cloud Compute seems to be trustworthy. But I fully appreciate Ackerman’s worries. Someone in his position must understand what is being stored and processed in which context.

However, Ackerman appears to have interpreted this setting change incorrectly:

I was alarmed to see that even my secure communications apps, like Proton and Signal, were toggled by default to “Learn from this App” and enable some subsidiary functions. I had to swipe them all off.

This setting was, to Ackerman, evidence of Apple “uploading your data to its new cloud-based AI project”, which is a reasonable assumption at a glance. Apple, like every technology company in the past two years, has decided to loudly market everything as being connected to its broader A.I. strategy. Because these features have launched in a piecemeal manner, though, it is not clear to a layperson which parts of iOS are related to Apple Intelligence, let alone where those interactions take place.

However, this particular setting is nearly three years old and unrelated to Apple Intelligence. It controls Siri Suggestions, which appear throughout the system. For example, the widget stack on my home screen suggests my alarm clock app when I charge my iPhone at night. It suggests I open the Microsoft Authenticator app on weekday mornings. When I do not answer the phone for what is clearly a scammer, it suggests I return the missed call. It is not all going to be gold.

Even at the time of its launch, the setting’s wording had the potential for confusion — something Apple has not clarified within the Settings app in the intervening years — and it seems to have been enabled by default. While this data may play a role in establishing the “personal context” Apple talks about — Siri Suggestions and personal context are both built on the App Intents framework — I do not believe it is used to train off-device Apple Intelligence models. However, Apple says this data may leave the device:

Your personal information — which is encrypted and remains private — stays up to date across all your devices where you’re signed in to the same Apple Account. As Siri learns about you on one device, your experience with Siri is improved on your other devices. If you don’t want Siri personalization to update across your devices, you can disable Siri in iCloud settings. See Keep what Siri knows about you up to date on your Apple devices.

While I believe Ackerman is incorrect about the setting’s function and how Apple handles its data, I can see how he interpreted it that way. The company is aggressively marketing Apple Intelligence, even though it is entirely unclear which parts of it are available, how it is integrated throughout the company’s operating systems, and which parts are dependent on off-site processing. There are people who really care about these details, and they should be able to get answers to these questions.

All of this stuff may seem wonderful and novel to Apple and, likely, many millions of users. But there are others who have reasonable concerns. As with any new technology, there are questions that can only be answered by those who created it. Only Apple is able to clear up the uncertainty around Apple Intelligence, and I believe it should. A cynical explanation is that this ambiguity is all deliberate because Apple’s A.I. approach is so much slower than its competitors’ and, so, it is disincentivized from setting clear boundaries. That is possible, but there is plenty of trust to be gained by being upfront now. Americans polled by Pew Research and Gallup have concerns about these technologies. Apple has repeatedly emphasized its privacy bona fides. But these features remain mysterious and suspicious for many people regardless of how much a giant corporation swears it delivers “stateless computation, enforceable guarantees, no privileged access, non-targetability, and verifiable transparency”.

All of that is nice, I am sure. Perhaps someone at Apple can start the trust-building by clarifying what the Siri switch does in the Settings app, though.
