AR development


Apple Joins Pixar, NVIDIA, & More to “accelerate next generation of AR experiences” with 3D File Protocol

Today, big tech companies including Apple, Pixar, Adobe, Autodesk, and NVIDIA announced the formation of the Alliance for OpenUSD (AOUSD), which is dedicated to promoting the standardization and development of a 3D file protocol that Apple says will “help accelerate the next generation of AR experiences.”

NVIDIA has been an early supporter of Pixar’s Universal Scene Description (USD), stating last year it thinks Pixar’s solution has the potential to become the “HTML of the metaverse.”

Much like HTML forms a sort of description of a webpage—being hostable anywhere on the Internet and retrievable/renderable locally by a web browser—USD can be used to describe complex virtual scenes, allowing it to be similarly retrieved and rendered on a local machine.
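
To make the analogy concrete, here's a quick sketch of what USD's human-readable .usda format looks like; the prim names and values below are purely illustrative, not drawn from any real scene:

```usda
#usda 1.0
(
    defaultPrim = "Room"
)

def Xform "Room"
{
    # A simple sphere placed half a meter up; any USD-aware app can render it.
    def Sphere "Ball"
    {
        double radius = 0.5
        double3 xformOp:translate = (0, 0.5, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```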

Here’s how the alliance describes its new OpenUSD initiative:

Created by Pixar Animation Studios, OpenUSD is a high-performance 3D scene description technology that offers robust interoperability across tools, data, and workflows. Already known for its ability to collaboratively capture artistic expression and streamline cinematic content production, OpenUSD’s power and flexibility make it an ideal content platform to embrace the needs of new industries and applications.

“Universal Scene Description was invented at Pixar and is the technological foundation of our state-of-the-art animation pipeline,” said Steve May, Chief Technology Officer at Pixar and Chairperson of AOUSD. “OpenUSD is based on years of research and application in Pixar filmmaking. We open-sourced the project in 2016, and the influence of OpenUSD now expands beyond film, visual effects, and animation and into other industries that increasingly rely on 3D data for media interchange. With the announcement of AOUSD, we signal the exciting next step: the continued evolution of OpenUSD as a technology and its position as an international standard.”

Housed by the Linux Foundation affiliate Joint Development Foundation (JDF), the alliance hopes to attract a diverse range of companies and organizations to actively participate in shaping the future of OpenUSD. For now it counts Apple, Pixar, Adobe, Autodesk, and NVIDIA as founding members, with general members including Epic Games, Unity, Foundry, Ikea, SideFX, and Cesium.

“OpenUSD will help accelerate the next generation of AR experiences, from artistic creation to content delivery, and produce an ever-widening array of spatial computing applications,” said Mike Rockwell, Apple’s VP of the Vision Products Group. “Apple has been an active contributor to the development of USD, and it is an essential technology for the groundbreaking visionOS platform, as well as the new Reality Composer Pro developer tool. We look forward to fostering its growth into a broadly adopted standard.”

Khronos Group, the consortium behind the OpenXR standard, launched a similar USD initiative in the past via its own Metaverse Standards Forum. It’s unclear how much overlap these initiatives will have, as that project was supported by AOUSD founders Adobe, Autodesk, and NVIDIA in addition to a wide swath of industry movers, such as Meta, Microsoft, Sony, Qualcomm, and AMD. Notably missing in the Metaverse Standards Forum was support from Apple and Pixar themselves.

We’re hoping to learn more at a long-form presentation of AOUSD during the Autodesk Vision Series on August 8th. There are a host of events leading up to SIGGRAPH 2023 though, which runs from August 6th to 10th, so we may learn more at any one of the companies’ own presentations on USD.


Meta Reveals New Prototype VR Headsets Focused on Retinal Resolution and Light Field Passthrough

Meta unveiled two new VR headset prototypes that showcase more progress in the fight to solve some persistent technical challenges facing VR today. Presenting at SIGGRAPH 2023, Meta is demonstrating a headset with retinal resolution combined with varifocal optics, and another headset with advanced light field passthrough capabilities.

Butterscotch Varifocal Prototype

In a developer blog post, Meta showed off a varifocal research prototype demonstrating a VR display system that provides “visual clarity that can closely match the capabilities of the human eye,” says Meta Optical Scientist Yang Zhao. The so-called ‘Butterscotch Varifocal’ prototype provides retinal resolution of up to 56 pixels per degree (PPD), which researchers say is sufficient for 20/20 visual acuity.

Since its displays are also varifocal, the headset can support focal depths from 0 to 4 diopters (i.e. optical infinity to 25 cm), matching what researchers say are “the dynamics of eye accommodation with at least 10 diopter/s peak velocity and 100 diopter/s² acceleration.” Pulsing motors adjust the displays’ focal distance in an effort to match the human eye.
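
For context, a diopter is just a reciprocal meter, so the quoted range is easy to sanity-check. A tiny illustrative sketch (my own, not Meta's code):

```swift
// A diopter is a reciprocal meter: focal distance = 1 / diopters.
// Illustrative helper only, not from Meta's research code.
func focalDistanceMeters(diopters: Double) -> Double {
    diopters == 0 ? .infinity : 1.0 / diopters
}

focalDistanceMeters(diopters: 0) // .infinity: relaxed focus at optical infinity
focalDistanceMeters(diopters: 4) // 0.25 m, i.e. the quoted 25 cm near point
```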

Varifocal headsets represent a solution to the vergence-accommodation conflict (VAC) which has plagued standard VR headsets, the most advanced consumer headsets included. Varifocal headsets not only include the same standard support for the vergence reflex (when eyes converge on objects to form a stereo image), but also the accommodation reflex (when the lens of the eye changes shape to focus light at different depths). Without support for accommodation, VR displays can cause eye strain, make it difficult to focus on close imagery, and may even limit visual immersion.

Check out the through-the-lens video below to see how Butterscotch’s varifocal bit works:

Using LCD panels readily available on the market, Butterscotch manages its 20/20 retinal display by reducing the field of view (FOV) to 50 degrees, considerably smaller than Quest 2’s ~89-degree FOV.
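
The tradeoff is simple arithmetic: angular resolution is roughly the panel's pixels divided by the degrees they're spread across, so shrinking the FOV raises PPD. A rough sketch with ballpark numbers (Quest 2's panels are 1832 pixels wide per eye; the Butterscotch pixel count is back-calculated for illustration):

```swift
// Rough approximation: PPD ≈ horizontal pixels / horizontal FOV in degrees.
// Real optics spread pixels non-uniformly, so treat these as ballpark figures.
func pixelsPerDegree(horizontalPixels: Double, fovDegrees: Double) -> Double {
    horizontalPixels / fovDegrees
}

pixelsPerDegree(horizontalPixels: 2800, fovDegrees: 50) // 56 PPD, Butterscotch-like
pixelsPerDegree(horizontalPixels: 1832, fovDegrees: 89) // ≈21 PPD, Quest 2-like
```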

Although Butterscotch’s varifocal abilities are similar to the company’s prior Half Dome prototypes, the company says Butterscotch is “solely focused on showcasing the experience of retinal resolution in VR—but not necessarily with hardware technologies that are ultimately appropriate for the consumer.”

“In contrast, our work on Half Dome 1 through 3 focused on miniaturizing varifocal in a fully practical manner, albeit with lower-resolution optics and displays more similar to today’s consumer headsets,” explains Display Systems Research Director Douglas Lanman. “Our work on Half Dome prototypes continues, but we’re pausing to exhibit Butterscotch Varifocal to show why we remain so committed to varifocal and delivering better visual acuity and comfort in VR headsets. We want our community to experience varifocal for themselves and join in pushing this technology forward.”

Flamera Lightfield Passthrough Prototype

Another important side of making XR more immersive is undoubtedly the headset’s passthrough capabilities, like you might see on Quest Pro or the upcoming Apple Vision Pro. The decidedly bug-eyed design of Meta’s Flamera research prototype seeks a better way to create more realistic passthrough by using light fields.

Research Scientist Grace Kuo wearing the Flamera research prototype | Image courtesy Meta

In standard headsets, cameras are typically placed a few inches from where your eyes actually sit, capturing a different view than what you’d see if you weren’t wearing a headset. While today’s standard headsets apply a lot of distortion and placement correction, you’ll probably still notice a ton of visual artifacts as the software tries to correctly resolve and render different depths of field.

“To address this challenge, we brainstormed optical architectures that could directly capture the same rays of light that you’d see with your bare eyes,” says Meta Research Scientist Grace Kuo. “By starting our headset design from scratch instead of modifying an existing design, we ended up with a camera that looks quite unique but can enable better passthrough image quality and lower latency.”

Check out the quick explainer below to see how Flamera’s ingenious capture methods work:

Now, here’s a comparison between an unobstructed view and Flamera’s light field capture, showing off some pretty compelling results:

As research prototypes, there’s no indication when we can expect these technologies to come to consumer headsets. Still, it’s clear that Meta is adamant about showing off just how far ahead it is in tackling some of the persistent issues in headsets today—something you probably won’t see from the patently black box that is Apple.

You can read more about Butterscotch and Flamera in their respective research papers, which are being presented at SIGGRAPH 2023, taking place August 6th – 10th in Los Angeles. Click here for the Butterscotch Varifocal abstract and Flamera full paper.


Vision Pro Dev Kit Applications Will Open in July

Apple says it will give developers the opportunity to apply for Vision Pro dev kits starting sometime in July.

In addition to releasing a first round of developer tools last week, including a software ‘Simulator’ of Vision Pro, Apple also wants to give developers a chance to get their hands on the headset itself.

The company indicates that applications for a Vision Pro development kit will open starting in July, and developers will be able to find details at Apple’s developer website when the time comes.

There’s no telling how many of the development kits the company plans to send out, or exactly when they will start shipping, but given Apple’s culture of extreme secrecy you can bet selected developers will be locked down with strict NDAs regarding their use of the device.

The Vision Pro developer kit isn’t the only way developers will be able to test their apps on a real headset.

Developers will also be able to apply to attend ‘Vision Pro developer labs’:

Apply for the opportunity to attend an Apple Vision Pro developer lab, where you can experience your visionOS, iPadOS, and iOS apps running on Apple Vision Pro. With direct support from Apple, you’ll be able to test and optimize your apps and games, so they’ll be ready when Apple Vision Pro is available to customers. Labs will be available in six locations worldwide: Cupertino, London, Munich, Shanghai, Singapore, and Tokyo.

Our understanding is that applications for the developer labs will also open in July.

Additionally, developers will be able to request that their app be reviewed by Apple itself on visionOS, though this is restricted to existing iPhone and iPad apps rather than newly created visionOS apps:

If you currently have an iPad or iPhone app on the App Store, we can help you test it on Apple Vision Pro. Request a compatibility evaluation from App Review to get a report on your app or game’s appearance and how it behaves in visionOS.

Vision Pro isn’t planned to ship until early 2024, but Apple wants to have third-party apps ready and waiting for when that time comes.


Apple Releases Vision Pro Development Tools and Headset Emulator

Apple has released new and updated tools for developers to begin building XR apps on Apple Vision Pro.

Apple Vision Pro isn’t due out until early 2024, but the company wants developers to get a jump-start on building apps for the new headset.

To that end, the company announced today that it has released the visionOS SDK, an updated Xcode, the new Simulator, and Reality Composer Pro, all of which developers can access at the visionOS developer website.

While some of the tools will be familiar to Apple developers, tools like Simulator and Reality Composer Pro are newly released for the headset.

Simulator is the Apple Vision Pro emulator, which aims to give developers a way to test their apps before getting their hands on the headset. The tool effectively acts as a software version of Apple Vision Pro, allowing developers to see how their apps will render and behave on the headset.

Reality Composer Pro is aimed at making it easy for developers to build interactive scenes with 3D models, sounds, and textures. From what we understand, it’s sort of like an easier (albeit less capable) alternative to Unity. However, developers who already know or aren’t afraid to learn a full-blown game engine can also use Unity to build visionOS apps.

Image courtesy Apple

In addition to the release of the visionOS SDK today, Apple says it’s still on track to open a handful of ‘Developer Labs’ around the world where developers can get their hands on the headset and test their apps. The company also says developers will be able to apply to receive Apple Vision Pro development kits next month.


A Concise Beginner’s Guide to Apple Vision Pro Design & Development

Apple Vision Pro has brought new ideas to the table about how XR apps should be designed, controlled, and built. In this Guest Article, Sterling Crispin offers up a concise guide for what first-time XR developers should keep in mind as they approach app development for Apple Vision Pro.

Guest Article by Sterling Crispin

Sterling Crispin is an artist and software engineer with a decade of experience in the spatial computing industry. His work has spanned product design and the R&D of new technologies at companies like Apple, Snap Inc, and various other tech startups working on face computers.

Editor’s Note: The author would like to remind readers that he is not an Apple representative; this info is personal opinion and does not contain non-public information. Additionally, more info on Vision Pro development can be found in Apple’s WWDC23 videos (select Filter → visionOS).

Ahead is my advice for designing and developing products for Vision Pro. This article includes a basic overview of the platform, tools, porting apps, general product design, prototyping, perceptual design, business advice, and more.

Overview

Apps on visionOS are organized into ‘scenes’, which are Windows, Volumes, and Spaces.

Windows are a spatial version of what you’d see on a normal computer. They’re bounded rectangles of content that users surround themselves with. These may be windows from different apps or multiple windows from one app.

Volumes are things like 3D objects or small interactive scenes: a 3D map, say, or a small game that floats in front of you rather than being fully immersive.

Spaces are fully immersive experiences where only one app is visible. That could be a Space filled with many Windows and Volumes from your app, or a VR game where the system goes away and fully immersive content surrounds you. You can think of visionOS itself as a Shared Space where apps coexist together and you have less control, whereas Full Spaces give you the most control and immersiveness but don’t coexist with other apps. Spaces also have immersion styles: mixed, progressive, and full, which define how much or how little of the real world the user sees.
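
In SwiftUI these scene types map directly to declarations in your app. Here's a minimal sketch of all three; the view names are placeholders, and the modifiers reflect the visionOS APIs as I understand them:

```swift
import SwiftUI

@main
struct ExampleApp: App {
    var body: some Scene {
        // A Window: a bounded 2D pane living in the Shared Space.
        WindowGroup(id: "main") {
            MainView()
        }

        // A Volume: a bounded 3D region, e.g. a map or board game.
        WindowGroup(id: "board") {
            BoardView()
        }
        .windowStyle(.volumetric)

        // A Space: fully immersive; only this app is visible while open.
        ImmersiveSpace(id: "arena") {
            ArenaView()
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed, .progressive, .full)
    }
}

// Placeholder views so the sketch compiles.
struct MainView: View { var body: some View { Text("Window") } }
struct BoardView: View { var body: some View { Text("Volume") } }
struct ArenaView: View { var body: some View { Text("Space") } }
```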

User Input

Users can look at the UI and pinch, like the Apple Vision Pro demo videos show. But you can also reach out and tap on windows directly, sort of like it’s actually a floating iPad, or use a Bluetooth trackpad or a video game controller. You can also look and speak into search bars. There’s also Dwell Control for eyes-only input, but that’s really an accessibility feature. For a simple dev approach, your app can just use events like a TapGesture; in that case, you won’t need to worry about where those events originate.
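
Here's a minimal sketch of that input-agnostic approach; the same handler fires whether the tap came from gaze-plus-pinch, a direct touch, or a trackpad click:

```swift
import SwiftUI

struct TapCounterView: View {
    @State private var taps = 0

    var body: some View {
        // The system routes gaze+pinch, direct touch, and trackpad input
        // to the same gesture, so the app never asks where a tap came from.
        Text("Tapped \(taps) times")
            .padding()
            .onTapGesture { taps += 1 }
    }
}
```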

Spatial Audio

Vision Pro has an advanced spatial audio system that makes sounds seem like they’re really in the room by considering the size and materials in your room. Using subtle sounds for UI interaction and taking advantage of sound design for immersive experiences is going to be really important. Make sure to take this topic seriously.

Development

If you want to build something that works across Vision Pro, iPad, and iOS, you’ll be operating within the Apple dev ecosystem, using tools like Xcode and SwiftUI. However, if your goal is to create a fully immersive VR experience for Vision Pro that also works on other headsets like Meta’s Quest or PlayStation VR, you have to use Unity.

Apple Tools

For Apple’s ecosystem, you’ll use SwiftUI to create the UI the user sees and the overall content of your app. RealityKit is the 3D rendering engine that handles materials, 3D objects, and light simulations. You’ll use ARKit for advanced scene understanding, like if you want someone to throw virtual darts and have them collide with their real wall, or do advanced things with hand tracking. But those rich AR features are only available in Full Spaces. There’s also Reality Composer Pro, a 3D content editor that lets you drag things around a 3D scene and make media-rich Spaces or Volumes. It’s like diet-Unity built specifically for this development stack.
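
As a small illustration of how SwiftUI and RealityKit meet, here's a hedged sketch of a view that loads and displays an entity; the asset name is a placeholder for whatever your Reality Composer Pro scene exports:

```swift
import SwiftUI
import RealityKit

struct ModelView: View {
    var body: some View {
        // RealityView bridges SwiftUI and RealityKit's 3D renderer.
        RealityView { content in
            // "Scene" is a placeholder asset name; in a real project it
            // usually comes from the Reality Composer Pro package bundle.
            if let entity = try? await Entity(named: "Scene") {
                content.add(entity)
            }
        }
    }
}
```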

One cool thing about Reality Composer Pro is that it’s already full of assets, materials, and animations. That helps developers who aren’t artists build something quickly, and it should help create a more unified look and feel across everything built with the tool. There are pros and cons to that product decision, but overall it should be helpful.

Existing iOS Apps

If you’re bringing an iPad or iOS app over, it will probably work unmodified as a Window in the Shared Space. If your app supports both iPad and iPhone, the headset will use the iPad version.

To customize your existing iOS app to take better advantage of the headset, you can use the Ornament API to make little floating islands of UI in front of, or beside, your app to make it feel more spatial. Ironically, if your app is using a lot of ARKit features, you’ll likely need to ‘reimagine’ it significantly to work on Vision Pro, as ARKit has been upgraded substantially for the headset.
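
For instance, here's a rough sketch of attaching an ornament to a window with the visionOS modifier; the content inside is a placeholder:

```swift
import SwiftUI

struct PlayerWindow: View {
    var body: some View {
        LibraryView()
            // Floats a small island of UI just below the window's bottom edge.
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("Play") { /* placeholder action */ }
                    Button("Pause") { /* placeholder action */ }
                }
                .padding()
                .glassBackgroundEffect()
            }
    }
}

// Placeholder main content so the sketch compiles.
struct LibraryView: View { var body: some View { Text("Library") } }
```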

If you’re excited about building something new for Vision Pro, my personal opinion is that you should prioritize how your app will provide value across iPad and iOS too. Otherwise you’re losing out on hundreds of millions of users.

Unity

You can build to Vision Pro with the Unity game engine, which is a massive topic. Again, you need to use Unity if you’re building to Vision Pro as well as a Meta headset like the Quest or PSVR 2.

Unity supports building Bounded Volumes for the Shared Space, which exist alongside native Vision Pro content, as well as Unbounded Volumes for immersive content that may leverage advanced AR features. Finally, you can also build more VR-like apps, which give you more control over rendering but seem to lack support for ARKit scene understanding like plane detection. The Volume approach gives RealityKit more control over rendering, so you have to use Unity’s PolySpatial tool to convert materials, shaders, and other features.

Unity support for Vision Pro includes tons of interactions you’d expect to see in VR, like teleporting to a new location or picking up and throwing virtual objects.

Product Design

You could just make an iPad-like app that shows up as a floating window, use the default interactions, and call it a day. But like I said above, content can exist in a wide spectrum of immersion, locations, and use a wide range of inputs. So the combinatorial range of possibilities can be overwhelming.

If you haven’t spent 100 hours in VR, get a Quest 2 or 3 as soon as possible and try everything. It doesn’t matter if you’re a designer, or product manager, or a CEO, you need to get a Quest and spend 100 hours in VR to begin to understand the language of spatial apps.

I highly recommend checking out Hand Physics Lab as a starting point and overview for understanding direct interactions. There are a lot of subtle things it does that imbue virtual objects with a sense of physicality. The YouTube VR app released in 2019 also looks and feels pretty similar to a basic visionOS app; it’s worth checking out.

Keep a diary of what works and what doesn’t.

Ask yourself: ‘What app designs are comfortable, or cause fatigue?’, ‘What apps have the fastest time-to-fun or value?’, ‘What’s confusing and what’s intuitive?’, ‘What experiences would you even bother doing more than once?’ Be brutally honest. Learn from what’s been tried as much as possible.

General Design Advice

I strongly recommend the IDEO-style design thinking process; it works for spatial computing too. You should absolutely try it out if you’re unfamiliar. There’s Design Kit with resources, and this video which, while dated, is a great example of the process.

The road to spatial computing is a graveyard of utopian ideas that failed. People tend to spend a very long time building grand solutions for the imaginary problems of imaginary users. It sounds obvious, but instead you should try to build something as fast as possible that fills a real human need, and then iteratively improve from there.

Continue on Page 2: Spatial Formats and Interaction »


Apple to Open Locations for Devs to Test Vision Pro This Summer, SDK This Month

Ahead of the Apple Vision Pro’s release in ‘early 2024’, the company says it will open several centers in a handful of locations around the world, giving some developers a chance to test the headset before it’s released to the public.

It’s clear that developers will need time to start building Apple Vision Pro apps ahead of its launch, and it’s also clear that Apple doesn’t have heaps of headsets on hand for developers to start working with right away. In an effort to give developers the earliest possible chance to test their immersive apps, the company says it plans to open ‘Apple Vision Pro Developer Labs’ in a handful of locations around the world.

Starting this summer, the Apple Vision Pro Developer Labs will open in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino.

Apple also says developers will be able to submit a request to have their apps tested on Vision Pro, with testing and feedback being done remotely by Apple.

Image courtesy Apple

Of course, developers still need new tools to build for the headset in the first place. Apple says devs can expect a visionOS SDK and updated versions of Reality Composer and Xcode by the end of June to support development on the headset. That will be accompanied by new Human Interface Guidelines to help developers follow best practices for spatial apps on Vision Pro.

Additionally, Apple says it will make available a Vision Pro Simulator, an emulator that allows developers to see how their apps would look through the headset.

Developers can find more info when it’s ready at Apple’s developer website. Closer to launch, Apple says Vision Pro will be available for the public to test in stores.


Croquet for Unity: A New Era for Multiplayer Development With “No Netcode” Solution

Croquet, the multiplayer platform for web and gaming, which took home the WebXR Platform of the Year award at this year’s Polys WebXR Awards, recently announced Croquet for Unity.

Croquet for Unity is an innovative JavaScript multiplayer framework for Unity – a platform for creating interactive, real-time 3D content – that simplifies development by eliminating multiplayer code and server setup. It connects developers with the distinct global architecture of the Croquet Multiplayer Network. The framework was demonstrated at GDC last week, and an early access beta is arriving in April 2023.

Effortless Networking for Developers

Croquet for Unity removes the need for developers to write and maintain networking code. Its Synchronized Computation Architecture makes server-side programming and traditional servers unnecessary.

Users connect through the Croquet Multiplayer Network, which consists of Reflectors—stateless microservers located across four continents—that guarantee smooth and uniform experiences for gamers.

Synchronizing Computation for Flawless Multiplayer

At its essence, Croquet focuses on synchronizing not only the state but also its progression over time. By harmonizing computation, Croquet eliminates the need to transmit the outcomes of intricate computations like physics or AI.

It also eliminates the need for particular data structures or sync indicators on designated objects. As a result, crafting multiplayer code becomes akin to writing single-player code, with the full game simulation executing on-device.
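
To illustrate the underlying idea (a concept sketch only; Croquet's actual API is JavaScript and differs), if every client runs the same deterministic simulation and applies the same ordered inputs, only the inputs ever need to cross the network:

```swift
// Concept sketch of synchronized computation, not Croquet's real API.
// A reflector's only job is to stamp inputs into one canonical order
// and broadcast them; no server runs the game itself.
struct Input {
    let playerID: Int
    let command: String
}

struct GameModel {
    private(set) var scores: [Int: Int] = [:]

    // Deterministic: the same inputs in the same order yield the same
    // state on every client, so state snapshots never need transmitting.
    mutating func apply(_ input: Input) {
        if input.command == "score" {
            scores[input.playerID, default: 0] += 1
        }
    }
}

var clientA = GameModel()
var clientB = GameModel()
let orderedInputs = [Input(playerID: 1, command: "score"),
                     Input(playerID: 2, command: "score")]

// Both replicas apply the same broadcast inputs and stay in lockstep.
for input in orderedInputs {
    clientA.apply(input)
    clientB.apply(input)
}
assert(clientA.scores == clientB.scores)
```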

Shared Virtual Computers for Perfect Sync

A shared virtual computer runs identically on all clients, providing perfect synchronization while giving each player a unique perspective. Lightweight reflectors can be positioned at the edge of the cloud or in a 5G network’s MEC (multi-access edge computing), offering lower latency than older architectures.

In addition, synchronized calculations performed on each client will replace traditional server computations, resulting in reduced bandwidth and improved latency.

Unprecedented Shared Multiplayer Simulations

Croquet not only facilitates multiplayer development but also enables previously unfeasible shared multiplayer simulations. Examples include real-time interactive physics as a fundamental game feature, fully reproduced non-player character behaviors, and sophisticated interactions between players while the game is live.

Due to bandwidth limits and intrinsic complexity, traditional networks are incapable of supporting these simulations.

“Innately Multiplayer” Games With No Netcode

“Multiplayer games are the most important and fastest-growing part of the gaming market. But building and maintaining multiplayer games is still just too hard,” said David A. Smith, founder and CTO of Croquet, in a press release shared with ARPost. “Croquet takes the netcode out of creating multiplayer games. When we say, ‘innately multiplayer,’ we mean games are multiuser automatically from the first line of code and not as an afterthought writing networking code to make it multiplayer.”

Croquet’s goal is to simplify developing multiplayer games, making it as easy as building single-player games. By removing netcode creation and administration, developers can concentrate on improving player experiences while benefiting from reduced overall creation and distribution costs, a speedier time to market, and enhanced player satisfaction.

Opening Doors for Indie Developers

Croquet for Unity is created for a wide range of gaming developers, but it is particularly advantageous for small, independent developers, who often find it harder to create multiplayer games due to a lack of in-house networking and backend expertise.

Secure Your Spot on the Croquet for Unity Beta Waitlist

Developers can sign up for the Beta Waitlist to access the Croquet for Unity beta, launching in April. The Croquet for Unity Package will be available for free in the Unity Asset Store upon commercial release, requiring a Croquet gaming or enterprise subscription and a developer API key for access to the global Croquet Multiplayer Network.


Xiaomi Unveils Wireless AR Glasses Prototype, Powered by Same Chipset as Meta Quest Pro

Chinese tech giant Xiaomi today showed off a prototype AR headset at Mobile World Congress (MWC) that wirelessly connects to the user’s smartphone, making for what the company calls its “first wireless AR glasses to utilize distributed computing.”

Called Xiaomi Wireless AR Glass Discovery Edition, the device is built upon the same Qualcomm Snapdragon XR2 Gen 1 chipset as Meta’s recently released Quest Pro VR standalone.

While specs are still thin on the ground, the company did offer some info on headline features. For now, Xiaomi is couching it as a “concept technology achievement,” so it may be a while until we see a full spec sheet.

Packing two microOLED displays, the company is boasting “retina-level” resolution, saying its AR glasses deliver 58 pixels per degree (PPD). For reference, Meta Quest Pro has a PPD of 22, while enterprise headset Varjo XR-3 cites a PPD of 70.

The company hasn’t announced the headset’s field of view (FOV), however it says its free-form light-guiding prism design “minimizes light loss and produces clear and bright images with a to-eye brightness of up to 1200nit.”

Electrochromic lenses are also said to adapt the final image to different lighting conditions, even including a full ‘blackout mode’ that ostensibly allows it to work as a VR headset as well.

Image courtesy Xiaomi

As for input, Xiaomi Wireless AR Glass includes onboard hand-tracking in addition to smartphone-based touch controls. Xiaomi says its optical hand-tracking is designed to let users do things like select and open apps, swipe through pages, and exit apps.

As a prototype, there’s no pricing or availability on the table; however, Xiaomi says the lightweight glasses (at 126g) will be available in a titanium-colored design with support for three sizes of nosepieces. An attachable glasses clip will also be available for near-sighted users.

In an exclusive hands-on, XDA Developers surmised the device felt near production-ready, though one of the issues noted during a seemingly bump-free demo was battery life: the headset had to be charged in the middle of the 30-minute demo. Xiaomi is apparently incorporating a self-developed silicon-oxygen anode battery that is supposedly smaller than a typical lithium-ion battery. While there’s an onboard Snapdragon XR2 Gen 1 chipset, XDA Developers also notes the glasses don’t offer any storage, making a compatible smartphone requisite to playing AR content.

This isn’t the company’s first stab at XR tech; last summer Xiaomi showed off a pair of consumer smartglasses, called Mijia Glasses Camera, that featured a single heads-up display. Xiaomi’s Wireless AR Glass is however much closer in function to the concept it teased in late 2021, albeit with chunkier free-form light-guiding prisms than the more advanced-looking waveguides teased two years ago.

Xiaomi is actively working with chipmaker Qualcomm to ensure compatibility with Snapdragon Spaces-ready smartphones, which include Xiaomi 13 and OnePlus 11 5G. Other contributions may come in the future from Lenovo and Motorola, which have also announced their intention to support Snapdragon Spaces.

Qualcomm announced Snapdragon Spaces in late 2021; it’s a software toolkit focused on performance and low-power devices that allows developers to create head-worn AR experiences from the ground up, or add head-worn AR to existing smartphone apps.


Blippar Expands Blippbuilder Support to AR Glasses Under New CEO

Blippar has long offered its AR creation tool Blippbuilder, which recently adopted a “freemium” pricing model. Naturally, the tool was built around smartphones, which is how most people still experience AR. However, with the increasing prevalence of AR-enabled headsets, the company is expanding the tool’s availability.

To learn more about Blippbuilder on headsets, the company’s long-term strategy, and the effects of other Blippar developments, ARPost met the company’s new CEO, Preet Prasannan.

Meet New CEO Preet Prasannan

Prasannan is Blippar’s new CEO, but he isn’t new to the company. He discovered Blippar almost ten years ago, while working at DreamWorks, when his manager left DreamWorks to join Blippar.

“At the time, I got very excited about what Blippar was doing in AR,” said Prasannan. “To be honest, I didn’t even know what AR was.”

Prasannan worked at Blippar for a time before leaving to found his own startup. He was still working on that project when Blippar ran into problems and ultimately entered administration. Prasannan returned to Blippar and was instrumental in its return as “Blippar 2.0,” serving as its Chief Technical Officer.

“Blippar was like family to me, so I reached out, we started speaking,” said Prasannan. “I realized that there was an opportunity to bring Blippar back to life.”

Prasannan was the CTO throughout the tenure of CEO Faisal Galaria, who recently stepped down. This offered another opportunity for Prasannan to step up.

“In December, when Faisal decided to part ways with us, we decided it would be good if I was up for it,” said Prasannan. “This is my family.”

Blippbuilder Comes to Next-Gen Hardware

The first big move under Prasannan’s leadership is bringing Blippbuilder compatibility to AR glasses. While AR on a head-mounted display and AR on a handheld display might sound similar, there were some initial hurdles.

“To be frank, it was a bit of heavy lifting when we started on headsets. The first one,” said Prasannan. “The first headset that we supported took us six months and the next headset that we supported took us 48 hours.”

The first two headsets were Magic Leap and Meta Quest Pro. While some things are being ironed out before the next selection of compatible AR headsets is released, Prasannan says that the company can now achieve compatibility with new headsets essentially as fast as they are produced. Which is good, because headsets are being produced a lot more regularly these days.

Blippbuilder for AR glasses - Meta Quest Pro

“For the next generation of AR, we have to have devices that feel natural,” said Prasannan. “It becomes a natural way of seeing and visualizing AR content.”

This isn’t just a way of future-proofing Blippar. It’s also a way to advance AR as a field worth buying into.

“If you have amazing, exciting content and a tool that creates content easily, why would you not want to buy that headset?” asked Prasannan.

A Growing Ecosystem

The announcement is exciting in another way as well: the sort of experiences that are created using Blippbuilder, particularly since it became free to use. The move has also been positive for Blippar, of course.

“We had tens of thousands of users more than usual joining us,” said Prasannan. “It seems like we took the right step when we went in that direction.”

So, who are all of those new users? Naturally, they don’t all fit into one basket, but Prasannan said that there have been a lot of educational experiences created.

“We saw a very interesting solar system being created by one of our users,” said Prasannan. “I was actually showing it to the kids in my family and the feedback was immediate.”

There has been a long-standing chasm in the promising field of educational XR: educators don’t typically know how to build experiences, and experience builders don’t typically know how to educate. Blippbuilder’s free, no-code, increasingly versatile authoring tool is helping to bridge that gap.

“One of the driving factors of switching to the freemium model was to encourage creativity in all of our users,” said Prasannan. “Right now, Blippbuilder is free so anyone can create an account and publish projects.”

More to Come From Blippar

There are more big things coming from Blippar, as a “new iteration of Blippbuilder” is scheduled to release as a beta toward the end of Q1 of this year. The tool will “make developers and technologists out of anyone who wants to” because “technology should make things more simple, not more complicated.”


GDC 2023 State of the Game Industry Report Includes Insights Into VR and AR

Games are the largest use case for consumer VR and AR. While VR and AR remain a comparatively small segment of the games industry, the industry is taking notice as VR hardware in particular improves. This presents possibilities for new kinds of games but also promises to breathe new life into established franchises.

The GDC’s State of the Game Industry Report for this year is by no means dedicated to VR and AR. However, it does hold insights into how this segment of the industry is growing and changing. This includes insights into larger emerging technology trends like Web3 and the metaverse.

VR and AR in the Larger Games Industry

This GDC survey, the 11th in an annual series, found that “the metaverse has become more than a buzzword.” That doesn’t mean that VR and AR are now the driving force in the games industry.

In terms of which platforms developers are building for, VR headsets land in 10th place with 12% of respondents. AR landed in 14th place with 4% of respondents. When asked which platforms developers are building their next project for, VR headsets remained in 10th place with 12%, but AR moved up to 11th place with 5%.

GDC State of the Game Industry 2023 - platforms developers are building for
Source: GDC State of the Game Industry 2023

PC leads the pack, with the intervening platforms consisting of the usual suspects – legacy gaming platforms and mobile. However, this may be changing in the near future.

When asked which platforms developers are most interested in, 23% of respondents said VR, pushing the platform to 6th place, ahead of Android, Mac, and Xbox One. Similarly, 12% responded with AR, placing it in 11th place ahead of PS4 and web browsers.

GDC State of the Game Industry 2023 - which platform most interests game developers
Source: GDC State of the Game Industry 2023

So, while we might not see a boom period for VR and AR games in the immediate future, it’s increasingly on the radar of game developers. This trend looks like it could be setting up growth in this aspect of the industry within the next few years.

That said, last year’s big metaverse hype may have led to increased expectations for the cycle we’re in now. Last year, 42% of respondents said that they were actively involved in VR and AR game development. Now that number is at 38%, closer to where it was in 2021.

Platform Wars Within VR

So, of the developers that are working in VR and AR gaming, what platforms are they working on?

When asked which platform their next game will release on, 36% responded with Quest, meaning Quest 2. An additional 10% responded with “Project Cambria” – the Quest Pro, which had not yet been released at the time of the survey. A further 10% responded with Rift, Meta’s now-discontinued line of tethered PC VR headsets.

GDC State of the Game Industry 2023 - VR and AR platforms developers are building games for
Source: GDC State of the Game Industry 2023

It is worth noting that the percentage of respondents working with Quest has gone up almost 10% since last year. That in itself would not be surprising were it not for the fact that the overall number of VR and AR game developers has gone down.

Interestingly, the runner-up is the as-yet-unreleased PlayStation VR 2 with 18%, followed by the HTC VIVE ecosystem at 15%. A further 12% responded with Apple’s ARKit, and another 9% responded with Android’s ARCore. There was also a potentially unexpected write-in entry.

“A handful of respondents shared that they were developing games for Pico, a platform that was not on the survey list,” the report offers. In some geographical markets, the Pico 4, which was announced shortly before the Quest Pro, is a significant potential Quest Pro competitor. However, Pico Interactive does not currently offer consumer support in the US.

Gaming in the Metaverse?

“The concept of the metaverse continues to pick up steam in the game industry, as new and existing companies alike move to secure funding, spin up projects, and develop new technology,” reads the survey. However, like VR and AR gaming, this news comes with a grain of salt and some more sober attitudes since last year.

Nearly half of the respondents didn’t select any of the survey’s platform options, instead saying that “the metaverse concept will never deliver on its promise.” This occurred last year as well, when around a third of respondents said that the metaverse would never materialize.

From a VR and AR perspective, it gets worse. More developers said that Fortnite would become the model metaverse platform than Horizon Worlds. This isn’t bad news because Horizon Worlds is better than Fortnite; it’s bad news because Horizon Worlds is VR and Fortnite isn’t. In fact, many of the more popular “metaverse” contenders are flat platforms.

GDC 2023 State of the Game Industry - Metaverse promise
Source: GDC State of the Game Industry 2023

And it gets worse: “Microsoft/Minecraft” came in a distant third place, with 7% of respondents choosing it as the model metaverse. This presumably included AltspaceVR. As this article was being written, it was announced that AltspaceVR is coming to an end.

A Note on Blockchain

ARPost is not explicitly interested in blockchain but as a potential pillar of both the metaverse and the future of gaming, it shouldn’t be inappropriate to share some of the survey’s findings in this field. And, if you aren’t explicitly interested in blockchain either, the survey results should please you.

When asked about their interest in blockchain integration in games, 23% of respondents said that they were “very interested” or “somewhat interested,” while 75% said they were not interested at all. The remaining 2% are using blockchain in games already, with blockchain being the principal monetization strategy of around 4% of games.

Interest in blockchain is down slightly from last year, but, according to the report, most respondents were against blockchain last year as well and simply haven’t changed their minds.

GDC State of the Game Industry 2023 - blockchain in game industry
Source: GDC State of the Game Industry 2023

“Many developers said there could be a valuable place for blockchain technology in video games in the future,” the report explains. “Others said that the risks outweigh the benefits and that existing technologies serve similar purposes that negate the need for blockchain.”

A Maturing Industry

If you thought that the gaming industry was moving a little too fast last year, you were right. Metaverse hype driven by hardware expectations and blockchain buzz may have led to a brief, hard burn in the industry. It now seems that a small correction has taken place, and the VR and AR games industry is settling in for longer-term development.

For the full picture of the whole gaming industry, find the complete report here.
