Samsung has secured a trademark with the UK’s Intellectual Property Office for ‘Samsung Glasses’, which could mean we’re finally seeing some movement from the South Korean tech giant to release an XR headset.
Filed in August and entered into the registry in November, the Samsung Glasses trademark specifies that it covers “Virtual reality headsets; Augmented reality headsets; Headphones; Smartphones; Smart glasses.” The news was first reported by UploadVR.
Samsung announced in February it was partnering with Google and Qualcomm to develop an XR device, something the company said at the time was “not too far away.” We’re still not sure what it will be other than hardware made by Samsung, software by Google, and chipset by Qualcomm.
While unconfirmed as a related device, Samsung filed for the US trademark ‘Samsung Galaxy Glasses’ just a month later. The US filing’s description covers the same categories as the UK trademark.
Granted, that’s a pretty wide range of devices spanning the entire XR gamut. Still, both proposed ‘Glasses’ monikers seem to imply the device wouldn’t be a direct competitor to either Meta Quest 3 or Apple Vision Pro, the $3,500 mixed reality headset set to launch sometime early next year.
Vision Pro is well beyond the size of spectacles, so calling anything with that form factor ‘glasses’ would be an odd move. There are a few candidates though.
Something similar to Meta’s Smart Glasses from Ray-Ban could be a better fit for the ‘Samsung Glasses’ name. Confusingly enough, Meta Smart Glasses don’t have a display, instead packing in cameras, off-ear headphones, microphones, and voice access to Meta’s digital assistant.
Another option might be a device similar to XREAL’s Air 2 Pro, which packs in birdbath optics and micro-OLEDs for traditional content consumption, such as film, TV, and flatscreen video games.
A full-blown pair of all-day AR glasses is decidedly out of the picture though, as optics and battery technology (among other things) still aren’t at a point where they’d fit into a glasses format. These are problems that every major tech company in the XR space is working on currently, but it’s safe to say we’re years away from what many hope will be the next major computing platform.
Apple Vision Pro isn’t slated to launch until early next year, but if you’ve got an iPhone 15 Pro you can already start capturing memories as spatial videos.
With the recent release of iOS 17.2 beta, Apple quietly added its first pass at spatial video capture for the iPhone 15 Pro and iPhone 15 Pro Max.
“Capture spatial video with remarkable depth on iPhone 15 Pro to view in 3D in the Photos app on Apple Vision Pro,” Apple writes in the update’s release notes. “Turn on spatial video capture in Settings > Camera > Formats, then capture spatial videos in Video mode in the Camera app.”
Enabling the mode adds a new Vision Pro icon to the Camera app. Tapping it instructs you to rotate the phone sideways into a landscape view and locks the capture settings to 1,920 × 1,080 at 30 FPS. This allows the phone to capture two video streams from different lenses, then the footage is compared and processed to add depth information to the final video.
An exaggerated example of spatial video playback on Vision Pro
When played back on Vision Pro, the headset’s stereoscopic displays let users see the depth as part of the video, but on an iPhone spatial videos play back in monoscopic mode and look no different than a regular video.
Although you don’t have the option to actually watch spatial video yet, it’s kind of nice that Apple is rolling out this feature ahead of the holidays, allowing people to start capturing memories of loved ones today that they might not see for another year.
If you have an iPhone 15 Pro or Pro Max and want to try capturing spatial video yourself, you can join the Apple Beta Software Program to install the iOS 17.2 beta.
Apple is adding two new locations to its Vision Pro ‘Developer Labs’ so devs can get their hands on the headset before it launches early next year.
It might not feel like it but 2024 will be here before we know it, and Apple has recently said it’s on track to launch Vision Pro “early” next year.
To get Vision Pro into developers’ hands ahead of launch, Apple runs a handful of ‘Developer Labs’ where developers can go to check out the device and get feedback on their apps. Today the company announced it’s opening two more locations: New York City, USA and Sydney, Australia.
Even with the two new locations, Vision Pro Developer Labs are still pretty sparse, but here’s the full list to date:
Cupertino, CA, USA
London, England, UK
Munich, Germany
Shanghai, China
Tokyo, Japan
New York City, USA
Sydney, Australia
Singapore
Apple is also offering developers ‘compatibility evaluations’ where the company will test third-party Vision Pro apps and provide feedback. The company is also giving select developers access to Vision Pro development kits.
Vision Pro is Apple’s first-ever XR headset and it’s sure to shake up the industry one way or another, perhaps starting with the way the company is approaching ‘social’ in XR.
As a leading social media company, it seemed like Meta would be in the best position to create a rich social experience on its XR headsets. But after almost a decade of building XR platforms, interacting with friends on Meta’s headsets is still a highly fragmented affair. With Vision Pro, Apple is taking a different approach—making apps social right out of the box.
Meta’s Social Strategy in a Nutshell
Horizon Worlds is the manifestation of Meta’s social XR strategy. A space where you and your friends can go to build or play novel virtual games and experiences. It’s the very beginnings of the company’s ‘metaverse’ concept: an unlimited virtual space where people can share new experiences and maybe make some new virtual friends along the way.
But if you step out of Horizon, the rest of the social experience on the Quest platform is quite fragmented.
The most basic form of ‘social’ is just hanging out with people you already know, doing things you already know you like to do—like watching a movie, playing a board game, or listening to music. But doing any of that on Meta’s headsets means jumping through a fragmented landscape of different apps and different ways to actually get into the same space with your friends.
On Quest, some apps use their own invite system and some use Meta’s invite system (when it works, anyway). Some apps use your Meta avatar and some use their own. As far as the interfaces and how you get in the same place with your friends, it’s different from app to app to app. Some even have separate accounts and friends lists.
And let’s not forget, many apps on Quest aren’t social in the first place. You might have made an awesome piece of 3D art but have no way to show your friends except to figure out how to take a screenshot and get it off of your headset to send to their phone. Or you might want to watch a movie release, but you can only do it by yourself. Or maybe you want to sit back and listen to a new album…maybe you can dig through the Quest store to find an app that allows a shared browser experience so you can listen through YouTube with someone else?
Apple’s Approach to Social on Vision Pro
Apple is taking a fundamentally different approach with Vision Pro by making social the expectation rather than the exception, and providing a common set of tools and guidelines for developers to build from so that social feels cohesive across the platform. Apple’s vision isn’t about creating a server full of virtual strangers and user-generated experiences, but about making it easy to share the stuff you already like to do with the people you already know.
This obviously leans into the company’s rich ecosystem of existing apps—and the social technologies the company has already battle-tested on its platforms.
SharePlay, already present on iOS and macOS devices, lets people watch, listen, and experience apps together through FaceTime. On Vision Pro, Apple intends to use its SharePlay tech to make many of its own first-party apps—like Apple TV, Apple Music, and Photos—social right out of the box, and it expects developers to do the same. In its developer documentation, the company says it expects “most visionOS apps to support SharePlay.”
For one, SharePlay apps will support ‘Spatial Personas’ on Vision Pro (that’s what Apple calls its avatars which are generated from a scan of your face). That means SharePlay apps on the platform will share a common look for participants. Apple is also providing several pre-configured room layouts that are designed for specific content, so developers don’t need to think about where to place users and how to manage their movement (and to finally put an end to apps spawning people inside of each other).
For instance, if a developer is building a movie-watching app, one of the templates puts all users side-by-side in front of a screen. But for a more interactive app where everyone is expected to actively collaborate there’s a template that puts users in a circle around a central point. Another template is based on presenting content to others, with some users close to the screen and others further away in a viewing position.
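Apple hasn’t published the internals of these templates, but the “users in a circle around a central point” idea is easy to picture with a generic layout helper. This is a hypothetical sketch in Python—none of these names come from Apple’s SharePlay API:

```python
import math

def circle_layout(participants, radius_m=1.2):
    """Hypothetical layout helper: evenly space participants on a
    circle around a central point, each facing the center.
    Positions are (x, z) in meters on the floor plane."""
    seats = []
    for i in range(participants):
        theta = 2 * math.pi * i / participants
        seats.append({
            "x": radius_m * math.cos(theta),
            "z": radius_m * math.sin(theta),
            # Facing angle points back toward the center of the circle.
            "facing": theta + math.pi,
        })
    return seats

for seat in circle_layout(4):
    print(f"x={seat['x']:+.2f}  z={seat['z']:+.2f}")
```

The value of pre-configured layouts like this is that every app that uses them gets sane spawn positions for free, which is exactly the “no spawning inside each other” problem Apple is solving at the platform level.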
With SharePlay, Apple also provides the behind-the-scenes plumbing to keep apps synchronized between users, and it says the data shared between participants is “low-latency” and end-to-end encrypted. That means you can have fun with your friends without worrying about anyone listening in.
People You Already Know, Things You Already Do
Perhaps most importantly, Apple is leaning on every user’s existing personal friend graph (i.e. the people you already text, call, or email), rather than trying to create a bespoke friends list that lives only inside Vision Pro.
Rather than launching an app and then figuring out how to get your friends into it, with SharePlay Apple is focused on getting together with your friends first, then letting the group seamlessly move from one app to the next as you decide what you want to do.
Starting a group is as easy as making a FaceTime call to a friend whose number you already know. Then you’re already chatting virtually face-to-face before deciding what you want to do. In the mood for a movie? Launch Apple TV and fire up whatever you want to watch—your friend is still right there next to you. Now the movie is over; want to listen to some music while you discuss the plot? Fire up Spotify and put on the movie’s soundtrack to set the scene.
Social by Default
Even apps that don’t explicitly have multi-user experience built-in can be ‘social’ by default, by allowing one user to screen-share the app with others. Only the host will be able to interact with the content, but everyone else will be able to see and talk about it in real-time.
It’s the emphasis on ‘social by default’, ‘things you already do’, and ‘people you already know’ that will make social on Vision Pro feel completely different than what Meta is building on Quest with Horizon Worlds and its ecosystem of fragmented social apps.
Familiar Ideas
Ironically, Meta experimented with this very style of social XR years ago, and it was actually pretty good. Facebook Spaces was an early social XR effort which leveraged your existing friends on Facebook, and was focused on bringing people together in a template-style layout around their own photo and video content. You could even do a Messenger Video Chat with people outside of VR to make them part of the experience.
Facebook Spaces was an eerily similar microcosm of what Apple is now doing across the Vision Pro platform. But as with many things on Quest, Meta didn’t have the follow-through to get Spaces from ‘good’ to ‘great’, nor the internal will to set a platform-wide expectation about how social should work on its headsets. The company shut down Spaces in 2019, but even at the time we thought there was much to learn from the effort.
Will Apple Succeed Where Meta Faltered?
Making basic flat apps social out of the box on Vision Pro will definitely make it easier for people to connect on the headset and ensure they can already do familiar things with friends. But certainly on Meta’s headsets the vast majority of ‘social’ is in discrete multiplayer gaming experiences.
And for that, it has to be pointed out that there are big limitations to SharePlay’s capabilities on Vision Pro. While it looks like it will be great for doing ‘things you already do’ with ‘people you already know’, as a framework it doesn’t map onto many of the multiplayer gaming experiences that people are having on headsets today.
For one, SharePlay experiences on Vision Pro only support up to five people (probably due to the performance implications of rendering too many Spatial Personas).
Second, SharePlay templates seem like they’ll only support limited person-to-person interaction. Apple’s documentation is a little bit vague, but the company notes: “although the system can place Spatial Personas shoulder to shoulder and it supports shared gestures like a handshake or ‘high five,’ Spatial Personas remain apart.” That makes it sound like users won’t be able to have free-form navigation or do things like pass objects directly between each other.
And when it comes to fully immersive social experiences (e.g. Rec Room), SharePlay probably isn’t the right call anyway. Many social VR experiences (like games) will want to render different avatars that fit the aesthetic of the experience, and certainly more than five at once. They’ll also want more control over networking and over how users move and interact with each other. At that point, building on SharePlay might not make much sense, but we hope it can still be used to help with initial group formation and with joining other immersive apps together.
One of the most interesting things about Vision Pro is the way Apple is positioning its fully immersive capabilities. While many have interpreted the company’s actions as relegating VR to an afterthought, the reality is much more considered.
Vision Pro is somewhat ironic. It’s an incredibly powerful and capable VR headset, but Apple has done extensive work to make the default mode feel as little like being in VR as possible. This is of course what’s called ‘passthrough AR’, or sometimes ‘mixed reality’. We’re not quite there yet, but it’s clear that in Apple’s ideal world when you first put on the headset it should feel like nothing around you has even changed.
Apple doesn’t want Vision Pro to take over your reality… at least not all the time. It has gone to extensive lengths to try to seamlessly blend virtual imagery into the room around you. When floating UI panels are created, they are not only subtly transparent (to reveal the real world behind them), but the system even estimates the room’s lighting to cast highlights and shadows on the panels to make them look like they’re really floating there in front of you. It’s impressively convincing.
But none of this negates the fact that Vision Pro is a powerful VR headset. In my hands-on demo earlier this year, Apple clearly showed the headset is not only capable of fully immersive VR experiences, but that VR is a core capability of the platform. It even went so far as to add the ‘digital crown’ dial on the top of the headset to make it easy for people to transition between passthrough AR and a fully immersive view.
Much of the commentary surrounding Vision Pro focused on the fact that Apple never actually said the words “virtual reality,” and how the headset lacks the kind of dedicated controllers that are core to most VR headsets today. It was reasoned that this is because the company doesn’t really want Vision Pro to have anything to do with VR.
As I’ve had more time to process my experience of using the headset and my post-demo discussions with some of the people behind the product, it struck me that Apple doesn’t want to avoid fully immersive VR—it’s actually embracing it, but in a way that’s essentially the opposite of what we’ve seen in most other headsets today. And frankly, I think its way is probably the approach the entire industry will adopt.
To understand that, let’s think about Meta’s Quest headsets. Though things might be changing soon with the release of Quest 3, up to this point the company has essentially used VR as the primary mode on its headsets, while passthrough AR was a sort of optional and occasional bonus mode—something apps only sometimes used, or something the user has to consciously toggle on.
On Vision Pro, Apple is doing the reverse. Passthrough AR is the default mode. But fully immersive VR is not being ignored; to the contrary, the company is treating VR as the most focused presentation of content on the headset.
In short, Apple is treating VR like a ‘full-screen’ mode for Vision Pro; the thing you consciously enable when you want to rid yourself of other distractions and get lost in one specific piece of media.
If you think about it, that’s exactly how we use full-screen on our computers and phones today.
Not every application on my computer launches in full-screen and removes my system UI or hides my other windows. In fact, the majority of apps on my computer don’t work this way. Most of the time I want to see my taskbar and my desktop and the various windows and controls that I use to manipulate data on my screen.
But if I’m going to watch a movie or play a game? Full-screen, every time.
That’s because these things are focused experiences where we don’t want to be distracted by anything else. We want to be engrossed by them so we remove the clutter and even let the application hide the mouse and give us a custom interface to better blend it with the media we’re about to engage with.
In the same way that you wouldn’t want every application on your computer to be in full-screen mode—with its own interface and style—Apple doesn’t think every application on your headset should be that way either.
Most should follow familiar patterns and share common interface language. And most do not need to be full-screen (or immersive). In fact, some things not only don’t benefit from being more immersive, in some cases they are made worse. I don’t need a fully immersive environment to view a PDF or spreadsheet. Nor do I need to get rid of all of my other windows and data if I want to play a game of chess. All of those things can still happen, but they don’t need to be my one and only focus.
Most apps can (and should) work seamlessly alongside each other. It’s only when we want that ‘full-screen’ experience that we should give an app permission to take over completely and block out the rest.
And that’s how Apple is treating fully immersive VR on Vision Pro. It isn’t being ignored; the company is simply baking in the expectation that people don’t want their apps ‘full-screen’ all the time. When someone does want to go full-screen, it’s always a conscious opt-in action, rather than opt-out.
As for the dial on the top of the headset—while some saw it as evidence that Apple wants to make it quick and easy for people to escape fully immersive VR experiences on the headset, I’d argue the company sees the dial as a two-way street: it’s both an ‘enter full-screen’ and an ‘exit full-screen’ button—the same as we expect to see in most media apps.
Ultimately, I think the company’s approach to this will become the norm across the industry. Apple is right: people don’t want their apps full-screen all the time. Wanting to be fully immersed in one thing is the exception, not the rule.
With the early 2024 release of Vision Pro quickly approaching, Apple is steadily updating its products to prepare for the new headset.
In addition to an upcoming spatial capture feature on iPhone 15 Pro, Apple also says its latest AirPods Pro wireless earbuds (2nd-gen, now with USB-C) will support lossless audio with ‘ultra-low latency’ to ensure that what you see and what you hear are closely synchronized for an immersive experience.
What Apple is calling a “groundbreaking wireless audio protocol” is powered by the H2 chip in the AirPods Pro 2 and Vision Pro. The specifics of the protocol haven’t been divulged, but the company says it will deliver 20-bit, 48 kHz lossless audio with a “massive reduction in audio latency.”
Low latency in XR is important because a headset’s visuals need to be as low latency as possible in order to keep users comfortable. Having audio that’s just as responsive (in order to keep sight and sounds in sync) sometimes comes at the cost of quality. The audio protocol Apple is now touting seems designed specifically to maintain lossless audio while also keeping latency as low as possible.
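For a sense of scale, the raw data rate of the stream Apple describes is easy to work out. Stereo is an assumption here—Apple hasn’t detailed the channel count—and this is the uncompressed rate before any lossless coding:

```python
bits_per_sample = 20   # from Apple's stated 20-bit depth
sample_rate_hz = 48_000  # from Apple's stated 48 kHz rate
channels = 2           # assumption: stereo earbuds

# Raw (pre-compression) data rate of the audio stream.
bitrate_bps = bits_per_sample * sample_rate_hz * channels
print(f"{bitrate_bps / 1_000_000:.2f} Mbps")  # 1.92 Mbps
```

That’s well above what standard Bluetooth audio codecs typically carry, which hints at why a custom protocol between two H2 chips is needed.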
The AirPods Pro 2 have been out for a while, but when the company revealed its latest phones earlier this month with USB-C connectors for the first time, it also took the time to release the refreshed version of the AirPods Pro 2, now with USB-C as well.
This is also when we saw the first mention of the new low latency audio protocol; though considering that the original AirPods Pro 2 (with Lightning connector) also have an H2 chip, we certainly hope they will support the new protocol too. As for the non-Pro AirPods—which only have an H1 chip—it isn’t clear whether they will get support. We’ve reached out to Apple for more clarity on which devices will be supported.
Unity, maker of the popular game engine, announced earlier this week it’s getting ready to levy some pretty significant fees on developers, causing many to rethink whether it makes more sense to go with the main competition, Unreal Engine from Epic Games. It seems Epic isn’t wasting any time courting those creating projects for Apple Vision Pro.
According to Victor Lerp, Unreal Engine XR Product Specialist at Epic Games, the company is now “exploring native Unreal Engine support for Apple Vision Pro,” the upcoming mixed reality headset due to launch in early 2024.
Lerp says it’s still early days though, noting that it’s “too early for us to share details on the extent of support or timelines.”
Lerp posted the statement on Unreal Engine’s XR development forum. You can read it in full below, courtesy of Alex Coulombe, CEO of the XR creative studio Agile Lens:
During Vision Pro’s unveiling at WWDC in June, Apple prominently showcased native Unity support in its upcoming XR operating system, visionOS. Unity began offering beta access to its visionOS-supported engine shortly afterwards, making it feel like something of a ‘bait and switch’ for developers already creating new games, or porting existing titles to Vision Pro.
As explained by Axios, Unity’s new plan will require users of its free tier of development services to pay the company $0.20 per installation once their game crosses thresholds of both 200,000 installs and $200,000 in revenue. Subscribers to Unity Pro, which costs $2,000 a year, have a different fee structure that scales downwards in proportion to the number of installs. What constitutes an ‘install’ is still fairly nebulous at this point despite follow-up clarifications from Unity. Whatever the case, the change is set to go into effect on January 1st, 2024.
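To make the stakes concrete, here’s a rough sketch of how the free-tier fee could add up. It assumes the flat $0.20 rate applies only to installs beyond the threshold—Unity’s exact accounting remains unclear, so treat the numbers as illustrative:

```python
def unity_fee(installs, revenue_usd, fee_per_install=0.20,
              install_threshold=200_000, revenue_threshold=200_000):
    """Illustrative free-tier model: fees kick in only once BOTH
    thresholds are met, and (assumed) apply only to installs
    beyond the install threshold."""
    if installs < install_threshold or revenue_usd < revenue_threshold:
        return 0.0
    return (installs - install_threshold) * fee_per_install

print(unity_fee(150_000, 500_000))    # no fee: install threshold not met
print(unity_fee(1_200_000, 500_000))  # fee on the 1M installs past the threshold (~$200,000)
```

Under that model, a free-to-play hit with a million excess installs would owe roughly $200,000—a real bite for a small studio whose revenue comes from only a fraction of those installs.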
In the meantime, the proposed Unity price increase has caused many small to medium-sized teams to reflect on whether to make the switch to the admittedly more complicated Unreal Engine, or pursue other game engines entirely. A majority of XR game studios fit into that category, and the fees (among many other scenarios) could hobble teams looking to replicate free-to-play success stories like Gorilla Tag, which generated over $26 million in revenue after it hit the Quest Store late last year.
Apple’s “Wonderlust” product launch event featured the official unveiling of iPhone 15 and both Watch Series 9 and Ultra 2. While XR wasn’t a major focus of the event, Apple confirmed its upcoming mixed reality standalone Vision Pro isn’t seeing any delays to push it off its early 2024 release.
First unveiled at WWDC in June, Apple CEO Tim Cook said last night during the product event that Vision Pro is still “on track for release in early 2024.”
Vision Pro, which comes along with the very ‘pro’ price tag of $3,500, has reportedly been the subject of multiple delays in the past. The MR headset was widely thought to arrive sometime in 2022, although several successive reports maintained it was delayed multiple times since then.
With an “early 2024” launch in sight, Apple seems to be making some of the right moves in the background, as the company has already opened up applications for developer units which are undoubtedly already in the hands of studios.
Meanwhile, the Cupertino tech giant also announced it’s prepping iPhone 15 Pro to take stereoscopic video which can be viewed on Vision Pro. It’s an interesting choice, as features on the company’s most premium ‘Pro’ phone offerings tend to trickle down in successive generations. Here, the phone’s ultrawide and main cameras work together to create what Apple calls a “three-dimensional video.”
Apple today announced its iPhone 15 lineup of smartphones, including the iPhone 15 Pro which will be the company’s first phone to capture spatial video for immersive viewing on Vision Pro.
While Apple Vision Pro itself works as a spatial camera, allowing users to capture immersive photos and videos, I think we can all agree that wearing a camera on your head isn’t the most convenient way to capture content.
Apple seems to feel the same way. Today during the company’s iPhone 15 announcement, it was revealed that the new iPhone 15 Pro will be capable of capturing spatial video which can be viewed immersively on the company’s upcoming Vision Pro headset. The base versions of the phone, the iPhone 15 and iPhone 15 Plus, won’t have the spatial capture capability.
Details on exactly how this function works are slim for the time being.
“We use the ultrawide and main cameras together to create a three-dimensional video,” the company said during its announcement. But it isn’t clear if “three-dimensional” means stereoscopic footage with a fixed viewpoint, or some kind of depth projection with a bit of 6DOF wiggle room.
Given that the iPhone 15 Pro cameras are so close together—not offering enough distance between the two views for straightforward stereo capture—it seems that some kind of depth projection or scene reconstruction will be necessary.
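The baseline problem follows from the classic stereo triangulation relationship, depth = focal length × baseline ÷ disparity. The numbers below are purely illustrative, not iPhone specs:

```python
def disparity_to_depth(focal_px, baseline_m, disparity_px):
    """Classic stereo triangulation: depth = f * B / d.
    focal_px: focal length in pixels; baseline_m: distance between
    the two lenses; disparity_px: pixel offset of the same feature
    between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With a small baseline (lenses close together, as on a phone),
# distant objects produce tiny disparities, so depth estimates
# become noisy quickly as distance grows.
print(disparity_to_depth(focal_px=1500, baseline_m=0.012, disparity_px=9))  # 2.0 (m)
print(disparity_to_depth(focal_px=1500, baseline_m=0.012, disparity_px=3))  # 6.0 (m)
```

A narrow baseline compresses the usable disparity range, which is why extra processing (depth sensing, scene reconstruction) would help rather than relying on the raw stereo pair alone.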
Apple didn’t specifically say whether the phone’s depth-sensor is involved, but considering the phone uses it for other camera functions, we wouldn’t be surprised to find it has some role to play. Curiously, Apple didn’t mention spatial photo capture, though presumably that should be possible as well.
While users will be able to watch their immersive videos on Vision Pro, Apple also said they’ll be able to share the footage with others who can watch on their own headset.
While the new iPhone 15 lineup will launch on September 22nd, Apple says the spatial capture capability won’t be available until “later this year”—which is curious considering the company also said today that Vision Pro is “on track to launch in early 2024.” Perhaps the company plans to allow creators to access the spatial video files for editing and use outside of Apple’s platform?
Resolution Games revealed its popular tabletop dungeon-crawling RPG Demeo now features hand-tracking support on Quest 2 and Quest Pro with its latest mixed reality update. It also includes a few new features to its previously released MR mode that are aiming to appeal to both future owners of Meta Quest 3 and Apple Vision Pro.
Ahead of Demeo’s confirmed launch-day release on Quest 3 sometime this Fall, Resolution Games announced it’s now pushing out the game’s Mixed Reality 2.0 update, which is available starting today on the Quest 2 and Quest Pro.
Added to Demeo last Fall, the game’s mixed reality mode lets players take the gameboard out of the virtual world and into their living rooms. Now, in the game’s Mixed Reality 2.0 update, players can put down their controllers and use their hands to pick up miniatures, play cards, and roll the die.
The update also adds two new MR features: co-location to optimize local mixed reality multiplayer, and decorations (including candles and posters) that can be placed to set the mood.
Notably, the studio says the original Quest won’t be receiving the Mixed Reality 2.0 update, but instead is getting a separate final update, bringing ongoing support for the game on Quest 1 to a close. Cross-play support for Quest 1 has also been discontinued.
Demeo Battles, the game’s upcoming PvP mode, is set to feature a similar MR mode on Quest 2, Quest 3, and Quest Pro when it launches later this year, Resolution Games says.
Resolution Games says it’s currently focusing half the studio’s workload on the creation of MR projects. The studio tells Road to VR it has “multiple mixed reality titles actively in development across a number of devices for 2024 and beyond, including several with dedicated controller-free play.”
Additionally, the studio confirmed Demeo is in active development for Apple Vision Pro for “fully virtual as well as mixed reality gameplay,” with a flatscreen version planned for release on Mac.
Meta is reportedly teaming up with South Korean tech giant LG Electronics to offer up competition to Apple’s forthcoming Vision Pro mixed reality headset, which is slated to arrive sometime in 2024.
South Korea’s Maekyung (Korean) is reporting on two new Meta headsets: a low-cost Quest model that will be priced at “less than $200” coming in 2024, and a high-priced model in a joint venture with LG in 2025, which is supposedly set to take on Apple Vision Pro.
The report maintains the name of the Meta/LG headset will be ‘Meta Quest 4 Pro’.
Mass production of the so-called Quest 4 Pro is allegedly being handled by LG Electronics and LG Display, with LG Innotek and LG Energy Solution supplying parts.
Provided the report is true, it seems some very distinct battle lines are being drawn. Samsung announced earlier this year that it was working with Qualcomm and Google to develop an Android-powered XR device, which may also be positioned to compete against Apple and Meta.
Apple Vision Pro is coming next year, not only marking the Fruit Company’s much-awaited first XR headset, but also spurring a resurgence in public interest (and likely investment) in the XR space. At $3,500, Vision Pro is undoubtedly an expensive steppingstone to the company’s future augmented reality ambitions, but even if it’s ostensibly ignoring virtual reality in the meantime, it probably won’t forever.
Apple has a tendency to undervalue gaming initially, though perhaps reluctantly it eventually acknowledges gaming’s importance. Gaming in XR is considerably enhanced by fully immersive experiences and motion controllers, and Apple will probably start feeling the pressure of that demand from gamers and developers alike when it launches a consumer headset sometime down the road, causing it to relent (if only just).
What is Vision Pro?
Like many, Apple is investing in AR today because the headsets and glasses of tomorrow have a good chance of supplanting smartphones and becoming the dominant mobile computing platform of the future. Long considered the holy grail of immersive computing, all-day AR headsets represent a way of interacting with new layers of information in daily life which would span everything from turn-by-turn directions to gaming applications—like Google Maps directions floating on the street in front of your car or a city-wide version of Pokémon Go.
Granted, Vision Pro isn’t yet the sort of device you’ll take out to the park to catch a random Zubat or Rattata—it’s very much an indoor device that Apple envisions you’ll use to sit down and watch a virtual TV screen or stand up in place to have an immersive chat with a work colleague. But as an opening gambit, Apple’s initial pitch of Vision Pro has been fairly telling of its strategy for XR.
In the ‘one more thing’ bit of the WWDC keynote, Apple lauded Vision Pro’s AR capabilities thanks to its color passthrough cameras, impressively responsive UI, and, from our hands-on with the headset, rock-solid hand-tracking. The company focused almost entirely on the work and lifestyle benefits of AR, and much less on the comparatively more closed-off fully immersed capabilities of virtual reality.
Considering just how much time and effort Apple has spent talking about AR, you may be surprised to find out Vision Pro can actually play VR games. After all, like Meta Quest Pro or the upcoming Quest 3, it’s basically a VR headset with passthrough cameras—what we’d call a mixed reality headset. In fact, the headset is already confirmed to support one of VR’s most prominent social games.
What it lacks are motion controllers. Instead, Vision Pro is focusing on eye-tracking and hand-tracking as primary input methods, with support for traditional peripherals like keyboards, mice, and gamepads filling in the gaps for work and traditional flatscreen gaming. This means many VR developers looking to target Vision Pro will need to pare down their input schemes to refocus on hand-tracking, or create games from the ground up that don’t rely on the standard triggers, grip buttons, sticks, and half-dozen buttons.
Still, many VR games simply won’t translate without controllers, which above all provide important haptic feedback and a bevy of sticks and buttons for more complex inputs. Not only that, Vision Pro’s room-scale VR gaming chops are hobbled by a guardian limit of 10 feet by 10 feet (3m × 3m)—if the player moves any further, the VR experience will fade away, returning to the headset’s default AR view. There’s no such limit for AR apps, putting VR more or less into a virtual corner.
Denny Unger, CEO of pioneering VR studio Cloudhead Games, hits the nail on the head in a recent guest article, saying that Vision Pro “appears to be a VR headset pretending not to be a VR headset.”
Apple’s Chronically Late Adoption
Without speculating too far into its XR ambitions, it appears Apple is turning over something of a new leaf with Vision Pro. The company is reportedly departing from tradition by creating a dedicated Vision Products Group (VPG), which is tasked with spearheading XR product development. Apple typically distributes its product development efforts across more general departments, such as hardware, software, design, and services, rather than sectioning hardware development into individual product teams like Mac, Watch, iPad, and iPhone.
Not only that, but the company is also publicly accepting applications for development kits of the headset and hosting a handful of ‘developer labs’ around the world so that developers can get their hands and heads into the device ahead of time. It’s a decidedly different tactic than what we usually see from Apple.
The company’s wider strategy still seems to be in play, however. Apple traditionally enters markets where it believes it can make a significant impact and actually own something, making it often not the first, but frequently the most important, Big Tech company to validate an emerging market. The paradox here is that Apple is actually early to AR, but late to VR. Deemphasizing the now fairly mature VR segment in favor of potentially creating a stronger foundation for its future AR devices makes a certain amount of sense coming from Apple.
Meanwhile, Apple is reportedly preparing a more consumer-focused follow-up to Vision Pro that will hopefully cost less than a high-mileage, but still serviceable, 2008 Honda Civic. Whenever Apple pitches that cheaper Vision headset to everyday people, it will likely need more entertainment-focused experiences, including fully immersive VR experiences with VR controllers.
And while Apple still isn’t positioning Vision Pro as a fully-fledged VR headset, that doesn’t mean it won’t relent in the future, as it has with many crowd-pleasing features on iOS that in many cases don’t appear until years after they’ve been available on Android. In classic Apple style, it could offhandedly announce a pair of slick and ergonomic VR controllers as a pricey accessory during any of its annual product updates, and of course pretend it’s some great home-grown achievement.
Another big reason Apple may eventually decide to un-hobble a future Vision headset is its firm hold on app revenue. Apple’s XR headsets are on the same path as its iOS devices, which means the company captures a slice of revenue from every app you buy on iPhone, iPad, and Apple TV. Unlike Mac, which by all accounts is a second-class citizen for gaming, iOS devices seem to be getting their act together. Kind of.
In some ways the company has only just fully embraced gaming on iOS with the launch of Apple Arcade in 2019, which serves up a curated collection of high-quality games on iOS and Apple TV without any ads or in-app purchases. Still, it’s pretty clear Apple doesn’t have big gaming ambitions—it doesn’t hoover up game publishers or studios like Meta or Microsoft tend to—so if it does unharness Vision’s VR capabilities, it may do so without the same raison virtuelle d’être as Meta or ByteDance (the latter being the TikTok parent company that also owns the Pico XR platform).
Provided Apple can secure the same hefty market share with future Vision headsets as it holds with iPhone today (around 30%), it may be more inclined to stay competitive with more VR-forward companies. For now though, it isn’t emphasizing VR, or even really competing against anyone, which may be a safer bet as it ventures into some truly unknown territory. Once the ball gets rolling, the Cupertino tech giant will have fewer and fewer excuses not to toss out a pair of VR controllers and remove some of the arbitrary restrictions it’s imposed.
When that might happen, we don’t know, but it does sound awfully Apple-like to sit on much wanted features and eventually release them with a flick of the wrist.