Following a couple of pieces in the past week – one from The Information reiterating that Apple had prioritized making the Vision Pro cheaper versus shipping a new high-end model, and one from The New York Times questioning whether the product was already a failure – Mark Gurman attempts to unify the story for Bloomberg. He notes that while there has indeed been some "soul searching" about the device within the group working on it, the ultimate outcome here remains a focus on a more affordable version of the device in the shorter term, while still planning a second iteration of the high-end model down the road. So let's think of it this way:
- A new, lower-priced "Vision" headset in late 2025
- A new, high-end "Vision Pro" headset in 2026 or later
It still feels like if Apple can't get the 'Vision' right here, the next Vision Pro may be at risk within an Apple that has been more publicly killing its darlings in recent years. And there remains huge risk with that would-be 'Vision' product. Namely, Apple is struggling to get the price point of even the non-'Pro' product down to the $1,500 to $2,000 target range. Per Gurman:

Apple could strip out the EyeSight display — the feature that shows a user’s eyes on the outside — and reduce the specifications of the internal virtual reality screens. It also could use a less powerful chip and lower the quality of the augmented reality passthrough visuals, which show you the real-world views outside the device.
But then you’re left with a less appealing experience. Even at $1,500, the product would cost three times as much as rival devices from Meta Platforms Inc. — without the technological advances that made the Vision Pro superior to the competition.
Prototypes of the N107 also have a narrower field of view than the Vision Pro. And the company is considering making the device reliant on a tethered Mac or iPhone. That would let Apple save money on the processing power and components needed to make the Vision Pro a fully standalone product.

EyeSight needs to go. Even before the device launched, it was clearly a mistake. We all get what Apple was trying to do here, but beyond the fact that it makes you look beyond weird – something Apple may or may not be able to fix with tweaks and technology – it's just not a feature that's useful in the current, um, reality of the device. That is, outside of perhaps airplanes and a few jokers in cafes, no one is wearing this thing out of the house. And even there, it's usually when they're alone. So that's an easy cut.

A cut to the internal specs will be less easy, but hopefully time will heal that wound, as internal components naturally get cheaper. The main screens and video pass-through element they might be able to budge on a bit, but they're the key selling point of the device. Narrowing the field of view – which is already very narrow, more so than on Meta's far cheaper Quest devices right now – feels like a bridge too far.

The most compelling idea would be allowing/forcing the 'Vision' to tether to an iPhone or Mac to handle the computing for the device. Frankly, I think they should do this. I know they view it as a stand-alone computing device, one that could potentially do it all one day, but that day is far, far away. You already have to use the ridiculous (and ridiculously inconvenient) external battery to power the device. They should either put a Mac SoC in that brick or use their software screen-casting smarts to offload much of the work to an iPhone, iPad, or Mac. That sounds boring – basically the 'Vision' sounds like an external monitor in this scenario – but thanks to visionOS, Apple could handle this quite differently. And it doesn't have to be fully powered by an iPhone or Mac; those more powerful devices could simply augment the computing on the 'Vision'. Back to Gurman:

There may be lessons here from the Apple Watch. When that product debuted, it wasn’t clear why someone would need it. The watch was more of a novelty item or fashion accessory that could receive digital love taps from other users. Then Apple refocused on health, fitness tracking and communication, and the appeal was much clearer.
Perhaps Apple could do something similar with the Vision Pro. Rather than being a jack-of-all-trades, the headset could concentrate on video watching, FaceTime and serving as a Mac external monitor.
The company should abandon the idea that people are going to fall in love with third-party apps (something that hasn’t happened with the watch or Apple TV) or that the Vision Pro needs to be a full-fledged computer.
That could help Apple refine its sales pitch, at least until the more mainstream products are ready.

I think it's pretty clear that they should focus this product around watching content to start. Apple's own Immersive Video as they build up those libraries, but also your own content, both shot in 3D and now "upscaled" to 3D with visionOS 2. And yes, Hollywood content, ideally also shot in or converted to 3D. And sports! And then they slowly but surely build up the AR capabilities – such as you saw with the new What If... app/experience from Disney.

And the second core use case would be as a massive external monitor. The 'Vision' could still do other things – um, gaming, hello? – but these would be the key focal points and selling points of a cheaper version of the device. The Vision Pro would remain focused on doing it all stand-alone and breaking new ground in "Spatial Computing"; the 'Vision' would be more like a toe-dip. If they could get such a headset down to $999, we're starting to talk. But this is Apple...

The other element that hurts them in the above scenario is Meta. The Quest 3 is also improving quite rapidly via software updates. And undoubtedly there's a 'Quest 4' and/or a cheaper Quest (and perhaps another 'Quest Pro') on the horizon. Can Apple really sell a $1,000 or $1,500 or $2,000 'Vision' in such a market? Well yes, again, they're Apple. But it's unclear how big of a hit it would be outside of the core Apple ecosystem user base.

Anyway, I've been thinking about the framing of the then still-rumored Vision Pro through the lens of the Apple Watch for a while. Back in 2022:
My guess would be that Apple’s initial entrant into this world is more akin to the Apple Watch than anything else. That is, an interesting piece of hardware, tied to the iPhone, that has no idea what it wants to be yet. It will have to grow up before our eyes. Which the Apple Watch has. And Apple, much like Meta, hasn’t given up despite some wayward years. So… the fight should be on.
And 2023:
For whatever reason, my mind keeps coming back to the Apple Watch. My best guess here — having not seen the device, of course! — is that Reality Pro follows more of an Apple Watch trajectory. That is, it launches with a ton of buzz and people think it’s interesting but too expensive, a bit underwhelming, and an unclear market. And then Apple does what Apple does. Iterate, iterate, iterate. And in a few years, the device is far more compelling, at better price points, with much clearer product/market fit.
The key to this strategy is that Apple isn’t going to launch Reality Pro as a trial balloon. That is, a device in a space they think is interesting but only as an experiment. The only time Apple has really done this was famously with Apple TV (unveiled a few months before the iPhone, no less!). And I actually think that device has suffered its entire existence as a result of that wobbly commitment. It’s a solid device, but it’s not what it could have and should have been.
Finally, later in 2023:
While the Apple Watch did expand its feature set, it has largely been Apple itself and not third-party developers which have driven this. I would argue that Apple messed up by launching a dev kit before the product was ready and thus soiled the early third-party Apple Watch ecosystem in both the minds of developers and users. And the ecosystem hasn’t really rebounded since despite the success of the device now.
In hindsight, Apple may have benefitted from more explicitly doing what they did with the iPhone: allowing only first-party apps to start (while working with some third-parties to co-develop apps, like Apple did with Google Maps and YouTube back in the day) and then once they have a good feel for where the product is going, opening it up to third-party developers via SDKs. Underpromise and over-deliver. Many people — certainly developers — won’t like to hear this notion, but I do wonder if it wouldn’t be the right approach with RealityPro as well.
Of course, it doesn’t sound like Apple is going down this path. And instead is going to show off RealityPro and xrOS at WWDC in part to get developers excited to develop for the platform (and perhaps launch some early SDKs/dev kits?).
It's easy to say in hindsight that Apple bungled the Vision Pro rollout, but, well, I was saying that before they launched it. I think the above is correct or, alternatively, that Apple should have launched this version of the Vision Pro as a dev kit. It would have kept developer excitement high without having to worry at all about sales numbers – or, just as importantly, the optics around them – and would have bought Apple more time to refine the actual device.

But here we are now. In a way, the (mostly unforced) error of launching into a world enamored with AI may ultimately provide cover for Apple to iterate on their vision for the Vision lineup. But make no mistake, they're on the clock...

One more thing: Gurman says the cheaper Vision device is codenamed 'N107', while Wayne Ma and Qianer Liu of The Information say the cheaper model is the 'N109' – with Gurman noting that the second iteration of the Vision Pro is the 'N109'. Let's get our top-secret internal codenames straight here, okay?