Today, Apple announced a slate of more than a dozen upcoming Immersive Videos for its Vision Pro “spatial computing” headset. The first, titled Boundless, launches tonight at 9 pm ET. More will follow in the coming weeks and months.
The announcement follows a long, slow period for new Vision Pro-specific video content from Apple. The headset launched in early February with a handful of Immersive Video episodes ranging from five to 15 minutes each. Since then, only three new videos have been added.
On March 28, Apple released a highlight reel of Major League Soccer plays from the season that had ended months prior. A second episode of Prehistoric Planet, Apple’s Immersive Video dinosaur nature documentary, went live on April 19. And a new episode of the Adventure series titled “Parkour” landed on May 24.
The MLS video played more like a short ad for Apple’s MLS programming than anything else, but other Immersive Videos have impressed with their quality if not their creative ambition. They’re all short videos that put the viewer inside a moment in space and time with either animals or people doing their thing. The videos are high-resolution, and the 3D is generally well done. The production values are high, even if the narratives are light. They come across as tech demos, as much as anything, but they are impressive.
Tonight’s Boundless episode will allow viewers to see what it’s like to ride in a hot air balloon over sweeping vistas. Another episode titled “Arctic Surfing” will arrive this fall, Apple says. Sometime next month, Apple will publish the second episode of its real-world wildlife documentary, simply titled Wild Life. The episode will focus on elephants at Kenya’s Sheldrick Wildlife Trust. Another episode is in the works, too. “Later this year,” Apple writes in its newsroom post, “viewers will brave the deep with a bold group of divers in the Bahamas, who come face-to-face with apex predators and discover creatures much more complex than often portrayed.”
More on the way
In September, we’ll see the debut of a new Immersive Video series titled Elevated. Apple describes it as an “aerial travel series” in which viewers will fly over places of interest. The first episode will take viewers to Hawaii, while another planned for later this year will go to New England.
Apple is additionally partnering with Red Bull for a look at surfing called Red Bull: Big-Wave Surfing.
In addition to those documentary episodes, there will be three short films by year’s end. One will be a musical experience featuring The Weeknd, and another will take basketball fans inside the 2024 NBA All-Star Weekend. There will also be Submerged, the first narrative fictional Immersive Video on the platform. It’s an action short film depicting struggles on a submarine during World War II.
It’s good to see Apple finally making some movement here; the drought of content after the launch didn’t inspire confidence in the platform. Many people with mixed reality headsets use them heavily for a few weeks but either fail to find ways to fit them into their daily habits or run out of compelling content and taper off before long. To keep people invested in visionOS, Apple needs to maintain a rapid cadence of new content that users can look forward to at least every week. Otherwise, some users will see their headsets sit on shelves, forgotten.
When I reviewed the Vision Pro, I assumed that the Immersive Video episodes would roll out weekly. That hasn’t proven the case, and it still doesn’t look like it will. Apple is going to have to invest more in content (and take more risks with that content, moving beyond short tech demo documentaries) to make the Vision Pro stick with customers.
Apple’s Vision Pro headset went on sale outside the United States for the first time today, in the first of two waves of expanded availability.
The $3,499 “spatial computing” device launched back in February in the US, but it hasn’t taken the tech world by storm. Part of that has been its regional launch, with some of the biggest markets still lacking access.
Apple announced that the product would be sold internationally during its keynote at the Worldwide Developers Conference earlier this month.
The first new markets to get Vision Pro shipments are China, Japan, and Singapore, where the headset went on sale today.
A second wave will come on July 12, with the headset rolling out in Australia, Canada, France, Germany, and the United Kingdom.
When we first tested the Vision Pro in February, we wrote that it was a technically impressive device with a lot of untapped potential. It works very well as a personal entertainment device for frequent travelers, in particular. However, its applications for productivity and gaming still need to be expanded to justify the high price.
Of course, there have been conflicting rumors of late about just how expensive Apple plans to keep its mixed reality devices. One report claimed that the company put the brakes on a new version of the Vision Pro for now, opting instead to develop a cheaper alternative for a 2025 launch.
But another report in Bloomberg suggested that’s an overstatement. It simply noted that the Vision Pro 2 has been slightly delayed from its original target launch window and reported that the cheaper model will come first.
In any case, availability will have to expand and the price will ultimately have to come down if augmented reality is to become the major computing revolution that Apple CEO Tim Cook has predicted. This international rollout is the next step in testing whether there’s a market for that.
Meta will open up the operating system that runs on its Quest mixed reality headsets to other technology companies, it announced today.
What was previously simply called Quest software will be called Horizon OS, and the goal will be to move beyond the general-use Quest devices to more purpose-specific devices, according to an Instagram video from Meta CEO Mark Zuckerberg.
There will be headsets focused purely on watching TV and movies on virtual screens, with the emphasis on high-end OLED displays. There will also be headsets that are designed to be as light as possible at the expense of performance for productivity and exercise uses. And there will be gaming-oriented ones.
The announcement named three partners to start. Asus will produce a gaming headset under its Republic of Gamers (ROG) brand; Lenovo will make general-purpose headsets with an emphasis on “productivity, learning, and entertainment”; and Xbox and Meta will team up to deliver a special edition of the Meta Quest bundled with an Xbox controller, Xbox Cloud Gaming, and Game Pass.
Users running Horizon OS devices from different manufacturers will be able to stay connected in the operating system’s social layer of “identities, avatars, social graphs, and friend groups” and will be able to enjoy shared virtual spaces together across devices.
The announcement comes after Meta established itself as an early leader in the relatively small but interesting consumer mixed reality space, even as it sees diminishing returns on new devices while the market saturates.
Further, Apple recently entered the fray with its Vision Pro headset. The Vision Pro is not really a direct competitor to Meta’s Quest devices today—it’s far more expensive and loaded with higher-end tech—but it may only be the opening volley in a long competition between the companies.
Meta’s decision to make Horizon OS a more open platform for partner OEMs, in contrast to Apple’s usual focus on owning and integrating as much of its devices’ software, hardware, and services as it can, mirrors the smartphone market. There, Google’s Android (on which Horizon OS is based) runs on a variety of devices from a wide range of companies, while Apple’s iOS runs only on Apple’s own iPhones.
Meta also says it is working on a new spatial app framework to make it easier for developers with experience on mobile to start making mixed reality apps for Horizon OS and that it will start “removing the barriers between the Meta Horizon Store and App Lab, which lets any developer who meets basic technical and content requirements release software on the platform.”
Pricing, specs, and release dates have not been announced for any of the new devices. Zuckerberg admitted it’s “probably going to take a couple of years” for this ecosystem of hardware devices to roll out.
Tonight, Apple will debut some new Immersive Video content for the Vision Pro headset—the first sports content for the device. It doesn’t seem like much after two months of no new content, though.
Starting at 6 pm PT/9 pm ET, Vision Pro users will be able to watch a sports film captured for the platform’s Immersive Video format. The video will be a series of highlights from last year’s Major League Soccer (MLS) playoffs, and according to Six Colors, it will run just five minutes. It will be free for all Vision Pro users.
On February 2, Apple released what appeared to be the first episodes of three Immersive Video series: Adventure, Prehistoric Planet, and Wild Life. Each debuted alongside the Vision Pro’s launch with one episode labeled “Episode 1” of “Season 1.”
However, it’s been almost two months, and none of those series have received new episodes. The only other piece of Immersive Video content available is an Alicia Keys performance video that also debuted on February 2. Most of these videos were only a few minutes long.
That means that this short soccer video depicting sports moments from 2023 will be the only new piece of Immersive Video content Apple has put out since the device launched at the beginning of February.
When I reviewed the Vision Pro as an entertainment device, I lauded its capabilities for viewing 2D films and videos, but I also talked a bit about its 3D video capabilities. I said the first pieces of original 3D content from Apple seemed promising and that I looked forward to future episodes. Given that they were labeled just like Apple TV+ series in the TV app, I assumed they would arrive at a weekly cadence. Further episodes haven’t come.
Notably, Apple didn’t include a first-party app for playing 3D videos downloaded from the web with the Vision Pro, though an independent developer filled that gap with an app called Reality Player. There are a few 3D video streaming or downloading services in the visionOS App Store, but the selection is anemic compared to what other headsets offer.
Apple hasn’t been calling the Vision Pro a VR headset, opting instead for the term “spatial computing”—and that’s understandable because it does a lot more than most VR headsets.
But if you’re looking for new examples of the sorts of passive viewing content you can enjoy on other headsets, the Vision Pro is still far behind the competition two months in.
The device can display a wealth of 2D video content, but this drives home the initial impression that the Vision Pro is meant for viewing flat, 2D content as windows in 3D space. The situation isn’t quite as dire with apps and games, with a handful of new spatial apps in those categories rolling out in recent weeks.
Most apps behave just like iPad apps, with 2D viewports onto the content; you can position those viewports wherever you want in the room around you. Most video content is also 2D.
There are situations where that’s neat to have, but it’s surprising Apple hasn’t invested more in actual 3D content yet. In terms of new stuff, this short soccer video debuting tonight is all we have right now.
Dive into the comments on those videos and you’ll see a consistent ratio: about 20 percent of the commenters herald this as the future, and the other 80 percent mock it with vehement derision. “I’ve never had as much desire to disconnect from reality as this guy does,” one reads.
Over the next few weeks, I’m going all-in on trying the Vision Pro in all sorts of situations to see which ones it suits. Last week, I talked about replacing a home theater system with it—at least when traveling away from home. Today, I’m going over my experience trying to find a use for it out on the streets of Chicago.
I’m setting out to answer a few questions here: Does it feel weird wearing it in public spaces? Will people judge you or react negatively when you wear it—and if so, will that become less common over time? Does it truly disconnect you from reality, and has Apple succeeded in solving virtual reality’s isolationist tendencies? Does it provide enough value to be worth wearing?
As it turns out, all these questions are closely related.
The potential of AR in the wild
I was excited about the Vision Pro in the lead-up to its launch. I was impressed by the demo I saw at WWDC 2023, even though I was aware that it was offered in an ideal setting: a private, well-lit room with lots of space to move around.
Part of my excitement was about things I didn’t see in that demo but that I’ve seen developers explore in smartphone augmented reality (AR) and on niche platforms like HoloLens and Xreal. Some smart folks have already produced a wide variety of neat tech demos showing what you can do with a good consumer AR headset, and many of the most exciting ideas work outside the home or office.
I’ve seen demonstrations of real-time directions provided with markers along the street while you walk around town, virtual assistant avatars guiding you through the airport, menus and Yelp reviews overlaid on the doors of every restaurant on a city strip, public art projects pieced together by multiple participants who each get to add an element to a virtual statue, and much more.
Of course, all those ideas—and most others for AR—make a lot more sense for unobtrusive glasses than they do for something that is essentially a VR headset with passthrough. Nonetheless, I was hoping to get a glimpse of that eventuality with the Vision Pro.
For decades now, potential Apple customers have been able to wander into any Apple Store and get some instant eyes-on and hands-on experience with most of the company’s products. The Apple Vision Pro is an exception to this simple process; the “mixed-reality curious” need to book ahead for a guided, half-hour Vision Pro experience led by an Apple Store employee.
As a long-time veteran of both trade show and retail virtual-reality demos, I was interested to see how Apple would sell the concept of “spatial computing” to members of the public, many of whom have minimal experience with existing VR systems. And as someone who’s been following news and hands-on reports of the Vision Pro’s unique features for months now, I was eager to get a brief glimpse into what all the fuss was about without plunking down at least $3,499 for a unit of my own.
After going through the guided Vision Pro demo at a nearby Apple Store this week, I came away with mixed feelings about how Apple is positioning its new computer interface to the public. While the short demo contained some definite “oh, wow” moments, the device didn’t come with a cohesive story pitching it as Apple’s next big general-use computing platform.
Setup snafus
After arriving a few minutes early for my morning appointment in a sparsely attended Apple Store, I was told to wait by a display of Vision Pro units set on a table near the front. These headsets were secured tightly to their stands, meaning I couldn’t try a unit on or even hold it in my hands while I waited. But I could fondle the Vision Pro’s various buttons and straps while getting a closer look at the hardware (and at a few promotional videos running on nearby iPads).
After a few minutes, an Apple Store employee, whom we’ll call Craig, walked over and said with genuine enthusiasm that he was “super excited” to show off the Vision Pro. He guided me to another table, where I sat in a low-backed swivel chair across from another customer who looked a little zoned out as he ran through his own Vision Pro demo.
Craig told me that the Vision Pro marked the first time Apple Store employees like him had gotten hands-on access to a new Apple device well before the public, in order to facilitate the training needed to guide these in-store demos. He said that interest had been steady for the first few days of demos and that, after some initial problems, the store now mostly managed to stay on schedule.
Unfortunately, some of those demo kinks were still present. First, Craig had trouble tracking down the dedicated iPhone used to scan my face and determine the precise Vision Pro light seal fit for my head. After consulting with a fellow employee, they decided to have me download the Apple Store app and use a QR code to reach the face-scanning tool on my own iPhone. (I was a bit surprised this fit scanning hadn’t been offered as part of the process when I signed up for my appointment days earlier.)
It took three full attempts, scanning my face from four angles, before the app managed to spit out the code that Craig needed to send my fit information to the back room. Craig told me that the store had 38 different light seals and 900 corrective lens options sitting back there, ready to be swapped in to ensure maximum comfort for each specific demo.
After a short wait, another employee brought my demo unit out on a round wooden platter that made me feel like I was at a Japanese restaurant. The platter was artistically arranged, from the Solo Knit Band and fuzzy front cover to the gently coiled cord leading to the battery pack sitting in the center. (I never even touched or really noticed the battery pack for the rest of the demo.)
At this point, Craig told me that he would be able to see everything I saw in the Vision Pro, which would stream directly to his iPad. Unfortunately, getting that wireless connection to work took a good five minutes of tapping and tinkering, including removing the Vision Pro’s external battery cord several times.
Once everything was set, Craig gave me a brief primer on the glances and thumb/forefinger taps I would use to select, move, and zoom in on things in the visionOS interface. “You’re gonna pretend like you’re pulling on a piece of string and then releasing,” he said by way of analogy. “The faster you go, the faster it will scroll, so be mindful of that. Nice and gentle, nice and easy, and things will go smoothly for you.”
Fifteen minutes after my appointed start time, I was finally ready to don the Vision Pro.
A scripted experience
After putting the headset on, my first impression was how heavy and pinchy the Vision Pro was on the bridge of my nose. Thankfully, Craig quickly explained how to tighten the fit with a dial behind my right ear, which helped immediately and immensely. After that, it only took a minute or two to run through some quick calibration of the impressively snappy eye and hand tracking. (“Keep your head nice and still as you do this,” Craig warned me during the process.)
As we dove into the demo proper, it quickly became clear that Craig was reading from a prepared script on his iPhone. This was a bit disappointing, as the genuine enthusiasm he had shown in our earlier, informal chat gave way to a dry monotone when delivering obvious marketing lines. “With Apple Vision Pro, you can experience your entire photo library in a brand new way,” he droned. “Right here, we have some beautiful shots, right from iPhone.”
Craig soldiered through the script as I glanced at a few prepared photos and panoramas. “Here we have a beautiful panorama, but we’re going to experience it in a whole new way… as if you were in the exact spot in which it was taken,” Craig said. Then we switched to some spatial photos and videos of a happy family celebrating a birthday and blowing bubbles in the backyard. The actors in the video felt a little stilted, but the sense of three-dimensional “presence” in the high-fidelity video was impressive.
After that, Craig informed me that “with spatial computing, your apps can exist anywhere in your space.” He asked me to turn the digital crown to replace my view of the store around me with a virtual environment of mountains bathed in cool blue twilight. Craig’s script seemed tuned for newcomers who might be freaked out by not seeing the “real world” anymore. “Remember, you’re always in control,” Craig assured me. “You can change it at any time.”
From inside the environment, Craig’s disembodied voice guided me as I opened a few flat app windows, placing them around my space and resizing them as I liked. Rather than letting these sell themselves, though, Craig pointed out how webpages are “super beautiful [and] easy to navigate” on Vision Pro. “As you can also see… text is super sharp, super easy to read. The pictures on the website look stunning.” Craig also really wanted me to know that “over one million iPhone/iPad apps” will work like this on the Vision Pro on day one.
Apple’s highly anticipated mixed-reality Vision Pro headset will be available starting on February 2 at US retail Apple locations and on the Apple Store website, the company announced this morning. Preorders for the $3,499 “spatial computing” headset will start on January 19 at 5 am PST.
The stock model of the Vision Pro will include 256GB of storage, which can be used to store existing iOS apps or apps made specifically for the Vision Pro’s new visionOS operating system. The package will include the flexible Solo Knit Band seen in previous marketing materials, as well as a newly revealed “Dual Loop Band,” which adds a portion that goes over the top of the skull. Apple says the extra included band will “give users two options for the fit that works best for them.”
Here are the other items included in the Vision Pro box, according to Apple:
Light seal
Two light seal cushions
Apple Vision Pro cover
Polishing cloth
External battery USB-C charge cable
USB-C power adapter
For those needing corrective lenses while inside the Vision Pro, Apple says Zeiss “Readers” will be available for $99, while prescription Optical Inserts will run $149. Both options attach magnetically to the device to give an unobstructed view and allow for eye tracking while in Vision Pro.
What can this thing do, again?
In its press release announcing the launch details, Apple highlighted the availability of “more than 150” movies and TV shows that can be viewed in 3D on the device. The company also mentions Vision Pro’s support for “new spatial games,” including Game Room, What the Golf?, and Super Fruit Ninja, which “take advantage of the powerful capabilities of Apple Vision Pro to transform the space around players, offering unique and engaging gameplay experiences.”
Aside from those visionOS-enhanced experiences, Apple’s release talks up Vision Pro’s support for 2D productivity apps, including “Fantastical, Freeform, JigSpace, apps from Microsoft 365, and Slack.” Vision Pro users can also watch streaming services like Apple TV+, Disney+, and Max “on a screen that feels 100 feet wide with support for HDR content,” or play more than 250 Apple Arcade titles on the device.
Ars Technica went hands-on with the Vision Pro shortly after its original announcement last June, testing out its eye-tracking interface, avatar-based FaceTime calls, and immersive, 3D mixed reality content. We’re eager to get more time with the device for testing in Ars’ Orbiting HQ soon.
Laser Dance is an upcoming mixed reality game that seeks to use Quest’s passthrough capability as more than just a background. In this Guest Article, developer Thomas Van Bouwel explains his approach to designing an MR game that adapts to different environments.
Thomas is a Belgian-Brazilian VR developer currently based in Brussels. Although his original background is in architecture, his work in VR spans from indie games like Cubism to enterprise software for architects and engineers like Resolve. His latest project, Laser Dance, is coming to Quest 3 late next year.
For the past year I’ve been working on a new game called Laser Dance, built from the ground up for mixed reality (MR). My goal is to make a game that turns any room in your house into a laser obstacle course: players walk back and forth between two buttons, and each button press spawns a new parametric laser pattern they have to navigate through. The game is still in full development, aiming for a release in 2024.
If you’d like to sign up for playtesting Laser Dance, you can do so here!
Laser Dance’s teaser trailer, which was first shown right after Meta Connect 2023
The main challenge with a game like this, and possibly any roomscale MR game, is to make levels that adapt well to any room regardless of its size and layout. Furthermore, since Laser Dance is a game that requires a lot of physical motion, the game should also try to accommodate differences in people’s level of mobility.
To try and overcome these challenges, having good room-emulation tools that enable quick level design iteration is essential. In this article, I want to go over how levels in Laser Dance work, and share some of the developer tools that I’m building to help me create and test the game’s adaptive laser patterns.
Laser Pattern Definition
To understand how Laser Dance’s room emulation tools work, we first need to cover how laser patterns work in the game.
A level in Laser Dance consists of a sequence of laser patterns – players walk (or crawl) back and forth between two buttons on opposite ends of the room, and each button press enables the next pattern. These laser patterns will try to adapt to the room size and layout.
Since the laser patterns in Laser Dance’s levels need to adapt to different types of spaces, the specific positions of lasers aren’t pre-determined, but calculated parametrically based on the room.
Several methods are used to position the lasers. The most straightforward one is to apply a uniform pattern over the entire room. An example is shown below of a level that applies a uniform grid of swinging lasers across the room.
An example of a pattern-based level: a uniform pattern of movement is applied to a grid of lasers covering the entire room.
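To make the idea concrete, here is a minimal sketch of how such a uniform pattern could be computed. This is illustrative Python, not the game’s actual Unity code; the function and parameter names are my assumptions.

```python
import math

def uniform_laser_grid(room_width, room_depth, spacing=0.75, t=0.0):
    """Place one swinging laser per grid cell across the whole room.

    Returns a list of (x, z, angle) tuples: each laser's floor position
    and its current swing angle in radians. Illustrative sketch only.
    """
    lasers = []
    cols = max(1, int(room_width // spacing))
    rows = max(1, int(room_depth // spacing))
    for i in range(cols):
        for j in range(rows):
            x = (i + 0.5) * room_width / cols
            z = (j + 0.5) * room_depth / rows
            # Apply the same motion to every laser, phase-offset by cell,
            # so the swing ripples across the room instead of moving in unison.
            phase = (i + j) * 0.4
            angle = 0.5 * math.sin(t + phase)  # swing of roughly ±0.5 rad
            lasers.append((x, z, angle))
    return lasers

# A 4 m x 3 m room simply yields more or fewer cells of the same pattern.
print(len(uniform_laser_grid(4.0, 3.0)))
```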
Other levels may use the buttons’ orientation relative to each other to determine the laser pattern. The example below shows a pattern that creates a sequence of blinking laser walls between the buttons.
Blinking walls of lasers are oriented perpendicular to the imaginary line between the two buttons.
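The same idea in sketch form: given the two button positions, the wall centers and their shared orientation fall out of simple vector math. Again, this is hypothetical Python rather than the game’s actual code.

```python
import math

def laser_walls(button_a, button_b, wall_count=4):
    """Evenly space laser walls between two buttons.

    Each wall is perpendicular to the line from button_a to button_b.
    Points are (x, z) floor coordinates. Illustrative sketch only;
    a coincident-button check is omitted for brevity.
    """
    ax, az = button_a
    bx, bz = button_b
    dx, dz = bx - ax, bz - az
    length = math.hypot(dx, dz)
    # Unit vector along the button axis; each wall faces this direction.
    ux, uz = dx / length, dz / length
    walls = []
    for k in range(1, wall_count + 1):
        f = k / (wall_count + 1)  # fraction of the way between the buttons
        center = (ax + dx * f, az + dz * f)
        # The wall itself extends along the perpendicular (-uz, ux).
        walls.append({"center": center, "normal": (ux, uz)})
    return walls

for wall in laser_walls((0.0, 0.0), (4.0, 2.0)):
    print(wall)
```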
One of the more versatile tools for level generation is a custom pathfinding algorithm, which was written for Laser Dance by Mark Schramm, guest developer on the project. This algorithm tries to find paths between the buttons that maximize the distance from furniture and walls, making a safer path for players.
The paths created by this algorithm allow for several laser patterns, like a tunnel of lasers, or placing a laser obstacle in the middle of the player’s path between the buttons.
This level uses pathfinding to spawn a tunnel of lasers that snakes around the furniture in this room.
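The algorithm Mark wrote for the game hasn’t been published, but the general technique is well established: compute each free cell’s clearance from obstacles, then run a shortest-path search whose step cost rises as clearance falls. A rough Python sketch under those assumptions:

```python
import heapq

def clearance_path(grid, start, goal):
    """Path between two cells that prefers staying far from obstacles.

    grid: 2D list where 1 = obstacle (wall/furniture) and 0 = free space.
    start, goal: (row, col) tuples, e.g. the two button positions.
    Generic sketch only; Laser Dance's real algorithm is not public.
    """
    h, w = len(grid), len(grid[0])
    nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    # Multi-source BFS: distance from every cell to the nearest obstacle.
    dist = [[None] * w for _ in range(h)]
    frontier = [(r, c) for r in range(h) for c in range(w) if grid[r][c]]
    for r, c in frontier:
        dist[r][c] = 0
    while frontier:
        nxt = []
        for r, c in frontier:
            for dr, dc in nbrs:
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and dist[nr][nc] is None:
                    dist[nr][nc] = dist[r][c] + 1
                    nxt.append((nr, nc))
        frontier = nxt
    for r in range(h):  # handle rooms with no obstacles at all
        for c in range(w):
            if dist[r][c] is None:
                dist[r][c] = h + w

    # Dijkstra where stepping into a low-clearance cell costs more.
    def step_cost(r, c):
        return 1.0 + 10.0 / (1 + dist[r][c])

    best = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, cur = heapq.heappop(heap)
        if cur == goal:
            break
        if d > best[cur]:
            continue
        for dr, dc in nbrs:
            r, c = cur[0] + dr, cur[1] + dc
            if 0 <= r < h and 0 <= c < w and not grid[r][c]:
                nd = d + step_cost(r, c)
                if nd < best.get((r, c), float("inf")):
                    best[(r, c)] = nd
                    prev[(r, c)] = cur
                    heapq.heappush(heap, (nd, (r, c)))

    # Walk backwards from the goal to recover the path.
    path, cur = [goal], goal
    while cur != start:
        cur = prev[cur]
        path.append(cur)
    return path[::-1]

# Tiny room: a single block of furniture in the middle.
room = [[0, 0, 0, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0]]
print(clearance_path(room, (0, 0), (3, 4)))
```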
Room Emulation
The different techniques described above for creating adaptive laser patterns can sometimes lead to unexpected results or bugs in specific room layouts. Additionally, it can be challenging to design levels while trying to keep different types of rooms in mind.
To help with this, I spent much of Laser Dance’s early development building a set of room emulation tools that let me simulate a level and directly compare how it looks across different room layouts.
Rooms are stored in-game as a simple text file containing all wall and furniture positions and dimensions. The emulation tool can take these files and spawn several rooms next to each other directly in the Unity editor.
You can then swap out different levels, or even just individual laser patterns, and emulate these side by side in various rooms to directly compare them.
A custom tool built in Unity spawns several rooms side by side in an orthographic view, showing how a certain level in Laser Dance would look in different room layouts.
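The article doesn’t document the actual file format, but to illustrate how little data is involved, a hypothetical room file and loader might look something like this (the JSON structure and field names are invented for this example):

```python
import json

# Hypothetical room file. The real Laser Dance format isn't documented,
# but the article describes "a simple text file containing all wall and
# furniture positions and dimensions," which could look something like this.
ROOM_FILE = """
{
  "walls": [
    {"start": [0.0, 0.0], "end": [4.2, 0.0]},
    {"start": [4.2, 0.0], "end": [4.2, 3.1]},
    {"start": [4.2, 3.1], "end": [0.0, 3.1]},
    {"start": [0.0, 3.1], "end": [0.0, 0.0]}
  ],
  "furniture": [
    {"center": [1.0, 2.2], "size": [0.9, 0.6], "height": 0.75}
  ]
}
"""

def load_room(text):
    """Parse a room description into plain Python structures.

    An emulator can load many such files and spawn each room side by
    side for visual comparison, as described above.
    """
    data = json.loads(text)
    return data["walls"], data["furniture"]

walls, furniture = load_room(ROOM_FILE)
print(f"{len(walls)} walls, {len(furniture)} furniture pieces")
```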
Accessibility and Player Emulation
Just as the rooms people play in may differ, the players themselves will be very different as well. Not everyone may be able to crawl on the floor to dodge lasers or feel capable of squeezing through a narrow corridor of lasers.
Because of the physical nature of Laser Dance’s gameplay, there will always be a limit to its accessibility. However, to the extent possible, I would still like to try and have the levels adapt to players in the same way they adapt to rooms.
Currently, Laser Dance allows players to set their height, shoulder width, and the minimum height they’re able to crawl under. Levels will try and use these values to adjust certain parameters of how they’re spawned. An example is shown below, where a level would typically expect players to crawl underneath a field of lasers. When adjusting the minimum crawl height, this pattern adapts to that new value, making the level more forgiving.
Accessibility settings allow players to tailor some of Laser Dance’s levels to their body type and mobility restrictions. This example shows how a level that would have players crawl on the floor, can adjust itself for folks with more limited vertical mobility.
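As a simplified illustration of the idea (not the game’s actual implementation), a crawl-under pattern might adjust itself like this; the default value and safety margin here are invented for the example:

```python
DEFAULT_CRAWL_HEIGHT = 0.6  # meters; illustrative default, not from the game

def crawl_laser_height(player_min_crawl_height, margin=0.1):
    """Pick the height of a 'crawl under this' field of lasers.

    The lasers sit a safety margin above whatever the player said they
    can crawl under, so the pattern stays beatable for players with
    limited vertical mobility. Parameter names are assumptions.
    """
    return round(max(player_min_crawl_height, DEFAULT_CRAWL_HEIGHT) + margin, 2)

print(crawl_laser_height(0.5))  # default player: lasers at 0.7 m
print(crawl_laser_height(1.1))  # limited mobility: lasers raised to 1.2 m
```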
These player values can also be emulated in the custom tools I’m building. Different player presets can be swapped out to directly compare how a level may look different between two players.
Laser Dance’s emulation tools allow you to swap out different preset player values to test their effect on the laser patterns. In this example, you can notice how swapping to a more accessible player value preset makes the tunnel of lasers wider.
Data, Testing, and Privacy
A key problem with designing an adaptive game like Laser Dance is that unexpected room layouts and environments might break some of the levels.
To try and prepare for this during development, there is a button in the settings that players can press to share their room data with me. Using these emulation tools, I can then try and reproduce their issue in an effort to resolve it.
Playtesters can press a button in the settings to share their room layout. This allows for local reproduction of potential issues they may have seen, using the emulation tools mentioned above.
This, of course, should raise some privacy concerns, as players are essentially sharing parts of their home layout with me. From a developer’s standpoint, it has a clear benefit to the design and quality-control process, but as consumers of MR, we should also take an active interest in what personal data developers have access to and how it is used.
Personally, I think it’s important that sharing sensitive data like this requires active consent of the player each time it is shared – hence the button that needs to be actively pressed in the settings. Clear communication on why this data is needed and how it will be used is also important, which is a big part of my motivation for writing this article.
When it comes to MR platforms, an active discussion on data privacy is important too. We can’t always assume sensitive room data will be used in good faith by all developers, so as players we should expect clear communication and clear limitations from platforms regarding how apps can access and use this type of data, and stay vigilant about how and why certain apps request access to it.
Do You Need to Build Custom Tools?
Is building a handful of custom tools a requirement for developing adaptive Mixed Reality? Luckily the answer to that is: probably not.
We’re already seeing Meta and Apple come out with mixed reality emulation tools of their own, letting developers test their apps in a simulated virtual environment, even without a headset. These tools are likely to only get better and more robust in time.
There is still merit to building custom tools in some cases, since they will give you the most flexibility to test against your specific requirements. Being able to emulate and compare between multiple rooms or player profiles at the same time in Laser Dance is a good example of this.
– – — – –
Development of Laser Dance is still in full swing. My hope is that I’ll end up with a fun game that can also serve as an introduction to mixed reality for newcomers to the medium. Though it took some time to build out these emulation tools, they will hopefully both enable and speed up the level design process to help achieve this goal.
Less than 48 hours after Meta fully unveiled Quest 3, John Carmack, legendary programmer and former CTO of Oculus, expressed doubts about mixed reality’s ability to increase headset sales.
Carmack departed Meta late last year, concluding what he called at the time his “decade in VR.” Still, it’s clear this key figure in Oculus’ genesis story has a lot to offer when it comes to all things XR.
While Carmack doesn’t mention Quest 3 by name, it’s fairly clear he’s talking about Meta’s first consumer mixed reality headset, having tweeted a message of skepticism about the headset-selling power of MR apps:
“I remain unconvinced that mixed reality applications are any kind of an engine for increasing headset sales. High quality pass through is great, but I just don’t see applications built around integrating rendering with your real world environment as any kind of a killer app. I consider it interesting and challenging technology looking for a justification. The power of VR is to replace your environment with something much better, not to hang a virtual screen in your real environment. In all the highly produced videos demonstrating the MR future, the environments are always stylish, clean, and spacious. That is not representative of the real world user base. There is certainly some value in the efforts, but I have always thought there was much more low hanging fruit to be grabbed first.”
In a follow-up tweet, Carmack maintains he’s not criticizing the future of augmented reality, but rather how MR-capable VR headsets are being served up today:
“I am specifically talking about MR in todays [sic] VR headsets. The magical, all-day wear, full FOV AR headsets of people’s dreams would be great, but they don’t exist, even in labs with billions of dollars.”
Meta announced relatively few MR games for Quest 3 at its full unveiling last week, emphasizing that 50+ new VR games are coming by the end of this year, many of which will include “MR features” of some sort.
Still, getting Quest 3 sales to rival Quest 2’s ought to be a big focus for Meta, as the company revealed at Connect 2023 that it had just broken $2 billion in Quest game and app revenue to date.
While impressive, that figure signifies a dramatic slowing of content sales over the past year, putting Quest 3 in the metaphorical hot seat to continue the upward trend if Meta intends to defend its $4 billion-per-quarter investment in its Reality Labs XR division.
At Meta Connect 2023 today, among the avalanche of news, we learned Quest 3 is shipping October 10th, accessories are raining from the sky, and there will be “another 50+ titles” coming by year’s end.
Meta says over 100 new and upgraded titles are coming to Quest 3 by year’s end, with over half of them brand new.
Many of those 100+ apps and games are getting some form of “MR features” too, Meta says, which ought to help fill out what so far seems to be a fairly shallow pool of mixed reality content. Mixed reality games announced today include multiplayer tabletop battle game BAM! from I-Illusions, a mixed reality mode for Ghostbusters: Rise of the Ghost Lord, Lego Bricktales, and Meta’s First Encounters tutorial app for Quest 3 (we’ll be filling out this list as we learn more). Meta also showed MR modes for Stranger Things VR and Les Mills Bodycombat.
While full-color passthrough allows for mixed reality games, Quest 3 is thankfully also backward compatible with Quest 2’s entire library of more than 500 VR games and apps.
Granted, individual developers will need to push Quest 3-specific updates that overhaul things like texture quality and render resolution to get the most out of Meta’s latest and greatest.
Its higher-resolution, independent displays and second-gen Snapdragon XR2 chipset (see the full specs here) will also boost Quest 2 content out of the gate, making what’s there a little sharper and clearer, and a little less resource-intensive too.
– – — – –
Connect 2023 kicks off today, taking place September 27th and 28th at Meta’s Menlo Park headquarters. There’s been a ton of news already, so make sure to follow along with Connect for all of the latest XR stuff from Meta.
Unity, maker of the popular game engine, announced earlier this week that it’s getting ready to levy some significant fees on developers, causing many to rethink whether it makes more sense to go with the main competition, Unreal Engine from Epic Games. It seems Epic isn’t wasting any time in helping those creating projects for Apple Vision Pro make the transition.
According to Victor Lerp, Unreal Engine XR Product Specialist at Epic Games, the company is now “exploring native Unreal Engine support for Apple Vision Pro,” the upcoming mixed reality headset due to launch in early 2024.
Lerp says it’s still early days though, noting that it’s “too early for us to share details on the extent of support or timelines.”
Lerp posted the statement on Unreal Engine’s XR development forum. You can read it in full below, courtesy of Alex Coulombe, CEO of the XR creative studio Agile Lens:
During Vision Pro’s unveiling at WWDC in June, Apple prominently showcased native Unity support in its upcoming XR operating system, visionOS. Unity began offering beta access to its visionOS-supported engine shortly afterwards, which makes the new fees feel like something of a ‘bait and switch’ for developers already creating new games, or porting existing titles, for Vision Pro.
As explained by Axios, Unity’s new plan will require users of its free tier of development services to pay the company $0.20 per installation once their game passes both 200,000 downloads and $200,000 in revenue. Subscribers to Unity Pro, which costs $2,000 a year, have a different fee structure that scales downward in proportion to the number of installs. What constitutes an ‘install’ is still fairly nebulous at this point despite follow-up clarifications from Unity. Whatever the case, the change is set to go into effect on January 1st, 2024.
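As a rough illustration of what that free-tier structure could mean in practice, here is one plausible reading in Python. The assumption that the charge applies only to installs beyond the threshold is exactly that, an assumption; Unity’s own definition of an “install” remained unclear at the time of writing.

```python
def unity_runtime_fee(installs, revenue, fee_per_install=0.20,
                      install_threshold=200_000, revenue_threshold=200_000):
    """Rough model of the announced free-tier fee.

    Assumes (as one plausible reading of the announcement) that the
    $0.20 charge applies only to installs beyond the threshold, and only
    once BOTH thresholds are crossed. Not an official formula.
    """
    if installs < install_threshold or revenue < revenue_threshold:
        return 0.0
    return (installs - install_threshold) * fee_per_install

# A free-tier game with 250,000 installs and $300,000 in revenue:
print(unity_runtime_fee(250_000, 300_000))  # -> 10000.0, i.e. $10,000
```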
In the meantime, the proposed Unity price increase has caused many small to medium-sized teams to consider whether to make the switch to the admittedly more complicated Unreal Engine or to pursue other game engines entirely. A majority of XR game studios fit into that category, which (among many other scenarios) could hobble teams as they look to replicate free-to-play success stories like Gorilla Tag, which generated over $26 million in revenue after hitting the Quest Store late last year.
Meta is reportedly teaming up with South Korean tech giant LG Electronics to offer up competition to Apple’s forthcoming Vision Pro mixed reality headset, which is slated to arrive sometime in 2024.
South Korea’s Maekyung (Korean) is reporting on two new Meta headsets: a low-cost Quest model that will be priced at “less than $200” coming in 2024, and a high-priced model in a joint venture with LG in 2025, which is supposedly set to take on Apple Vision Pro.
The report maintains the name of the Meta/LG headset will be ‘Meta Quest 4 Pro’.
Mass production of the so-called Quest 4 Pro is allegedly being handled by LG Electronics and LG Display, with LG Innotek and LG Energy Solution supplying parts.
Provided the report is true, it seems some very distinct battle lines are being drawn. Samsung announced earlier this year that it was working with Qualcomm and Google to develop an Android-powered XR device, which may also be positioned to compete against Apple and Meta.