If you spend any time on tech Twitter or reading quarterly earnings coverage, you'll have seen the same argument a hundred times: Apple is behind on AI. The feature rollouts have been staggered, Siri's bigger upgrades keep slipping, and meanwhile OpenAI, Google, and Meta are shipping new models every other week. The obvious read is that Apple missed the boat.
Greg "Joz" Joswiak and John Ternus gave a different read in a Tom's Guide interview for Apple's 50th anniversary, and it's worth taking seriously. Their argument isn't that Apple is catching up. It's that the rest of the industry is racing in the wrong event.
"Early innings", not a crossroads
When Mark Spoonauer put the "AI crossroads" framing to him (the idea that we've reached a tipping point where people are either for it or afraid of it), Joswiak pushed back:
I'm not sure I'd use crossroads. I would use early innings. This is still very early days in what is going on with intelligence. This is not a sprint. This is a marathon. We're going to be doing stuff with intelligence for decades, not for months or years.
That's the whole thesis in two sentences. Apple isn't trying to win 2026. It's trying to build AI features that are still around in 2036, running on the same privacy-first architecture, integrated into products that people actually rely on every day.
It also explains the tempo. If your goal is a marathon, you don't sprint the first mile. You pace. You make sure the platform underneath holds up. You don't ship something that sets user expectations you can't meet in two years' time.
Technology versus experience
The second thing worth hearing is how Ternus frames what Apple actually ships:
We never think about shipping a technology. We always think about how can we leverage technology to ship amazing products and features and experiences for our users.
This sounds like marketing language until you look at what Apple has actually released. Live Translation on AirPods isn't sold as an AI feature. It's sold as a thing that works when you're abroad. Ternus pointed to it directly in the interview. The AirPods Max 2 got the same treatment: H2 chip upgrades that quietly power better noise cancellation and spatial audio, framed as audio improvements rather than machine learning wins.
Compare that to how most of the industry talks. Google's keynotes lead with model names. OpenAI announces GPT versions. Meta's strategy is basically "smart glasses, but with AI". The technology is the product. Apple's bet is the opposite: the experience is the product, and the technology can stay out of the way.
The "proactive" era
Joswiak dropped a line in the interview that's easy to miss but quietly important:
We actually called it proactive if you remember, Mark. We didn't even use machine learning or AI. We talked about how your devices could be proactive because they were learning these things.
That was ten-plus years ago. Suggested apps when you swipe down. Siri surfacing the transit app when you walk near a bus stop. Photo memories. Keyboard predictions. All of it ran on on-device machine learning, and Apple deliberately didn't call it that.
The reason isn't humility. It's that "AI" as a marketing term has always had a short half-life. Cloud, smart, intelligent, AI-powered: these labels come and go. The features underneath either work or they don't. Joswiak's point is that Apple has been shipping this stuff for years, under names that made sense to customers rather than names that impressed investors.
You can see the same pattern running through Apple Notes, smart folders, on-device search, handwriting recognition. None of it is badged as AI. All of it is machine learning doing quiet, reliable work.
When AI is invisible, it's working
Joswiak said something else that's stuck with me:
I love it when those things are happening and somebody doesn't even necessarily know that it's AI. It's just better. We make it so that you don't have to be a chatbot expert to get the most out of that technology.
This is the dividing line between two philosophies. One says the user should interact with AI: open a chatbot, write a prompt, wait for a response. The other says AI should interact with the system on the user's behalf, and the user just notices that the phone works better.
Neither is wrong. A ChatGPT session and a proactive notification are both genuinely useful, just for different things. But if you're Apple, with two billion devices in people's pockets and wrists and ears, invisibility scales better than chat. Most people don't want another app to learn. They want the apps they already have to get smarter.
The "death of apps" question
One of the louder predictions doing the rounds is that Siri and large language models will eventually replace app-based workflows. You won't open apps, you'll just tell your assistant what you want and it'll handle things across services. Joswiak's response was short:
The App Store is alive and well. I can tell you that. We're getting lots of great submissions and lots of great apps, and intelligence is certainly part of those apps. I think the rumours of its death may have been greatly exaggerated.
Read between the lines and the Apple position is pretty clear. AI doesn't replace apps. AI augments apps, the system layer, and the assistant that ties them together. The App Store is still the distribution channel. Developers are still the ones building the experiences. Intelligence is a feature, not a replacement.
For creators, this matters in a practical way. Your editing apps, your camera tools, the Apple Creator Studio bundle, none of these are going anywhere. They're going to get smarter, and the smart parts will mostly be invisible. That's the point.
What Apple's tempo actually buys them
Here's the thing the "Apple is behind" take misses. Every major AI feature that ships fast and breaks things sets user expectations Apple will eventually have to meet. Every hallucination, every privacy leak, every half-baked agent that books the wrong flight: these are all learning opportunities that Apple is happy to let other companies absorb.
When Apple does ship a fully autonomous Siri, it'll need to work for a parent who's never heard of a large language model. It'll need to handle private data without sending it to a server. It'll need to be reliable enough that you'd trust it with a calendar invite, a payment, or a message to your boss. The bar is much higher than "cool demo at a keynote".
Joswiak's marathon framing is the honest version of that. The race that matters isn't who ships an AI feature first. It's who ships AI that's still working, still trusted, and still making products better in ten years. That's a different game. And it's the one Apple appears to be playing on purpose.
FAQ
Is Apple really behind on AI?
It depends on how you measure it. Apple has fewer flashy generative features than OpenAI or Google, but it's been shipping on-device machine learning for over a decade under names like "proactive" and "smart". The pace is deliberate, not reactive.
What is Apple Intelligence, in simple terms?
Apple's umbrella term for its AI features, most of which run on-device for privacy. Writing Tools, notification summaries, Image Playground, and a smarter Siri are the headline pieces, with more rolling out across macOS, iOS, and iPadOS.
Will Siri replace apps?
Apple doesn't think so. Joswiak called the App Store "alive and well" in the interview and suggested AI will live inside apps and across the system, rather than replacing the app model entirely.
Source: Tom's Guide interview with Mark Spoonauer, Greg Joswiak, and John Ternus, marking Apple's 50th anniversary.
Lewis Lovelock
YouTuber, tech creator and CTO. I write about the apps, gear, and workflows I actually use — and make videos about them too.